MCP Server for Qdrant
by RollandMELET
This project provides an MCP (Model Context Protocol) server that gives Claude access to a Qdrant vector database hosted on a VPS. The server is deployed via Docker and exposes Qdrant to Claude over MCP.
What is MCP Server for Qdrant?
This is an MCP server that acts as a bridge between the Claude AI assistant and a Qdrant vector database. It allows Claude to retrieve relevant information from the Qdrant database based on the context of the conversation.
How to use MCP Server for Qdrant?
1. Deploy the Qdrant server (e.g., using Coolify).
2. Configure the MCP server using environment variables (QDRANT_URL, QDRANT_API_KEY, COLLECTION_NAME, EMBEDDING_MODEL).
3. Deploy the MCP server using Docker.
4. Configure Claude Desktop to use the MCP server by updating the claude_desktop_config.json file with the MCP server's URL and transport type (SSE), as sketched below.
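As a rough illustration of step 4, the snippet below patches claude_desktop_config.json with an entry for the server. The mcpServers/url/transport keys and the /sse endpoint path are assumptions based on common Claude Desktop setups, not taken from this project; verify them against the project's own instructions.

```python
# Hypothetical sketch: add an SSE entry for this MCP server to claude_desktop_config.json.
# The key names and endpoint path below are assumptions; check the project's docs.
import json
from pathlib import Path

config_path = Path("claude_desktop_config.json")  # use the config path for your OS
config = json.loads(config_path.read_text()) if config_path.exists() else {}

config.setdefault("mcpServers", {})["qdrant"] = {
    "url": "https://your-mcp-server.example.com/sse",  # hypothetical MCP server URL
    "transport": "sse",
}

config_path.write_text(json.dumps(config, indent=2))
print(json.dumps(config, indent=2))
```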
Key features of MCP Server for Qdrant
Exposes Qdrant to Claude via MCP
Dockerized deployment
Configurable via environment variables
SSE support for remote connection
Uses sentence-transformers for embeddings
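To make the configuration concrete, here is a minimal Python sketch of the retrieval flow such a server performs, using the qdrant-client and sentence-transformers packages and the environment variables listed above. It illustrates the idea only and is not this project's actual implementation.

```python
# Minimal sketch of the retrieval flow: embed the query, then search Qdrant.
# Not this project's code; it only shows how the environment variables are used.
import os

from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer

client = QdrantClient(
    url=os.environ["QDRANT_URL"],              # e.g. https://qdrant.example.com:6333
    api_key=os.environ.get("QDRANT_API_KEY"),  # optional if auth is disabled
)
model = SentenceTransformer(
    os.environ.get("EMBEDDING_MODEL", "sentence-transformers/all-MiniLM-L6-v2")
)

def retrieve(query: str, limit: int = 5):
    """Return the payloads of the points most similar to the query."""
    vector = model.encode(query).tolist()
    hits = client.search(
        collection_name=os.environ["COLLECTION_NAME"],
        query_vector=vector,
        limit=limit,
    )
    return [hit.payload for hit in hits]

if __name__ == "__main__":
    for payload in retrieve("How do I reset my password?"):
        print(payload)
```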
Use cases of MCP Server for Qdrant
Enabling Claude to answer questions based on data stored in Qdrant
Providing Claude with contextual information from a vector database
Integrating Claude with knowledge bases and document repositories
Building AI-powered applications that leverage vector search and Claude's language capabilities
FAQ about MCP Server for Qdrant
What is MCP?
MCP stands for Model Context Protocol. It's a protocol that allows AI models like Claude to access external data sources.
What is Qdrant?
Qdrant is a vector similarity search engine and vector database. It stores embeddings and lets you find the stored vectors most similar to a query vector.
How do I configure the QDRANT_URL?
Set QDRANT_URL to the full URL of your Qdrant server, including the scheme and port, for example https://qdrant.example.com:6333.
What is the purpose of the QDRANT_API_KEY?
The QDRANT_API_KEY is used to authenticate with your Qdrant server. It is only needed if API-key authentication is enabled on your Qdrant instance.
What embedding model should I use?
The default embedding model is sentence-transformers/all-MiniLM-L6-v2, a fast general-purpose model that produces 384-dimensional embeddings. You can choose a different model if you have specific requirements, but make sure the vector size of your Qdrant collection matches the model's output dimension (see the sketch below).
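For illustration, assuming the sentence-transformers package is installed, the following checks the output dimension of the default model; the Qdrant collection used by the server should be created with a matching vector size.

```python
# Check the output dimension of the default embedding model.
# all-MiniLM-L6-v2 produces 384-dimensional vectors, so the Qdrant collection
# should be created with vector size 384.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
embedding = model.encode("vector databases store embeddings")
print(len(embedding))  # 384
```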