# parquet_mcp_server

by DeepSpringAI
A powerful MCP (Model Context Protocol) server that provides tools for performing web searches and finding similar content. This server is designed to work with Claude Desktop and offers two main functionalities:
- Web Search: Perform a web search and scrape results
- Similarity Search: Extract relevant information from previous searches
This server is particularly useful for:
- Applications requiring web search capabilities
- Projects needing to find similar content based on search queries
## Installation

### Installing via Smithery

To install Parquet MCP Server for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @DeepSpringAI/parquet_mcp_server --client claude
```

### Manual Installation

Clone this repository:

```bash
git clone ...
cd parquet_mcp_server
```

Create and activate a virtual environment:

```bash
.venv\Scripts\activate     # On Windows
source .venv/bin/activate  # On macOS/Linux
```

Install the package:

```bash
uv pip install -e .
```
## Environment

Create a `.env` file with the following variables:

```bash
EMBEDDING_URL=http://sample-url.com/api/embed  # URL for the embedding service
OLLAMA_URL=http://sample-url.com/              # URL for Ollama server
EMBEDDING_MODEL=sample-model                   # Model to use for generating embeddings
SEARCHAPI_API_KEY=your_searchapi_api_key
FIRECRAWL_API_KEY=your_firecrawl_api_key
VOYAGE_API_KEY=your_voyage_api_key
AZURE_OPENAI_ENDPOINT=http://sample-url.com/azure_openai
AZURE_OPENAI_API_KEY=your_azure_openai_api_key
```
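The variables above follow the usual `KEY=VALUE` `.env` convention. As a rough sketch of how such a file can be parsed, here is a minimal loader (the server itself may well use python-dotenv or another mechanism; `load_env_file` and the `DEMO_EMBEDDING_MODEL` key are purely illustrative):

```python
import os
import tempfile

def load_env_file(path):
    """Minimal KEY=VALUE .env loader (illustrative sketch only)."""
    with open(path) as fh:
        for raw in fh:
            line = raw.split("#", 1)[0].strip()  # drop inline comments
            if not line or "=" not in line:
                continue
            key, value = line.split("=", 1)
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway file in the system temp directory
sample = os.path.join(tempfile.gettempdir(), "demo.env")
with open(sample, "w") as fh:
    fh.write("DEMO_EMBEDDING_MODEL=sample-model  # Model to use\n")
load_env_file(sample)
print(os.environ["DEMO_EMBEDDING_MODEL"])  # sample-model
```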
## Usage with Claude Desktop

Add this to your Claude Desktop configuration file (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "parquet-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/home/${USER}/workspace/parquet_mcp_server/src/parquet_mcp_server",
        "run",
        "main.py"
      ]
    }
  }
}
```
## Available Tools

The server provides two main tools:

1. **Search Web**: Perform a web search and scrape results
   - Required parameters:
     - `queries`: List of search queries
   - Optional parameters:
     - `page_number`: Page number for the search results (defaults to 1)

2. **Extract Info from Search**: Extract relevant information from previous searches
   - Required parameters:
     - `queries`: List of search queries to merge
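As a sketch of what these tool arguments look like in practice, here is a small validator for the Search Web parameters (the `validate_search_web_args` helper is hypothetical and not part of the server; only the parameter names and the default come from the list above):

```python
def validate_search_web_args(args):
    """Check Search Web tool arguments against the parameters described
    above (illustrative sketch, not the server's actual schema code)."""
    if "queries" not in args or not isinstance(args["queries"], list):
        raise ValueError("'queries' (a list of search queries) is required")
    args.setdefault("page_number", 1)  # optional, defaults to 1
    return args

# A call with only the required parameter picks up the default page
args = validate_search_web_args({"queries": ["macbook", "laptop"]})
print(args["page_number"])  # 1
```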
## Example Prompts

Here are some example prompts you can use with the agent:

For Web Search:

> "Please perform a web search for 'macbook' and 'laptop' and scrape the results from page 1"

For Extracting Info from Search:

> "Please extract relevant information from the previous searches for 'macbook'"
## Testing the MCP Server

The project includes a comprehensive test suite in the `src/tests` directory. You can run all tests using:

```bash
python src/tests/run_tests.py
```

Or run individual tests:

```bash
# Test Web Search
python src/tests/test_search_web.py

# Test Extract Info from Search
python src/tests/test_extract_info_from_search.py
```
You can also test the server using the client directly:

```python
from parquet_mcp_server.client import (
    perform_search_and_scrape,  # Web search function
    find_similar_chunks,        # Extract info function
)

# Perform a web search
perform_search_and_scrape(["macbook", "laptop"], page_number=1)

# Extract information from the search results
find_similar_chunks(["macbook"])
```
## Troubleshooting

- If you get SSL verification errors, make sure the SSL settings in your `.env` file are correct
- If embeddings are not generated, check:
  - The Ollama server is running and accessible
  - The model specified is available on your Ollama server
  - The text column exists in your input Parquet file
- If DuckDB conversion fails, check:
  - The input Parquet file exists and is readable
  - You have write permissions in the output directory
  - The Parquet file is not corrupted
- If PostgreSQL conversion fails, check:
  - The PostgreSQL connection settings in your `.env` file are correct
  - The PostgreSQL server is running and accessible
  - You have the necessary permissions to create/modify tables
  - The pgvector extension is installed in your database
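For the "server is running and accessible" checks above, a quick connectivity probe can save time before digging into model or data issues. A minimal sketch (the `service_reachable` helper is hypothetical, not part of this project; it only confirms that something answers HTTP at a URL such as the `OLLAMA_URL` or `EMBEDDING_URL` from your `.env`):

```python
from urllib import error, request

def service_reachable(url, timeout=3):
    """Return True if an HTTP GET to `url` gets any response at all.

    Hypothetical troubleshooting helper: an HTTP error status still
    means the server is up, so only connection failures return False.
    """
    try:
        with request.urlopen(url, timeout=timeout):
            return True
    except error.HTTPError:
        return True  # got an HTTP response: the server is reachable
    except (error.URLError, OSError, ValueError):
        return False

# An unreachable address simply reports False instead of crashing
print(service_reachable("http://127.0.0.1:9", timeout=1))  # False
```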