Chroma MCP Server

by djm81

The Chroma MCP Server is a Model Context Protocol (MCP) server integration for Chroma, the open-source embedding database. It provides a persistent, searchable "working memory" for AI-assisted development workflows, enabling automated context recall and developer-managed persistence.

What is Chroma MCP Server?

The Chroma MCP Server connects AI applications with Chroma through the Model Context Protocol, allowing AI models to store, retrieve, and search embeddings, manage collections, and support Retrieval Augmented Generation (RAG) workflows.

How to use Chroma MCP Server?

Install the server using pip or uvx. Configure it with command-line options or environment variables that specify the Chroma client type, data directory, log directory, and embedding function. Integrate with tools such as Cursor by adding an entry to the .cursor/mcp.json file.
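
As a rough sketch, a .cursor/mcp.json entry could look like the following. The top-level mcpServers/command/args layout is Cursor's standard MCP configuration; the server name "chroma", the option names (--client-type, --data-dir, --log-dir, --embedding-function), and the example paths are illustrative assumptions, so confirm the exact names against the project's API Reference.

  {
    "mcpServers": {
      "chroma": {
        "command": "uvx",
        "args": [
          "chroma-mcp-server",
          "--client-type", "persistent",
          "--data-dir", "/path/to/data",
          "--log-dir", "/path/to/logs",
          "--embedding-function", "default"
        ]
      }
    }
  }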

Key features of Chroma MCP Server

  • Automated Context Recall

  • Developer-Managed Persistence

  • Semantic Search on Vector Data

  • Support for RAG Workflows

Use cases of Chroma MCP Server

  • AI-assisted development

  • Maintaining context across multiple sessions

  • Building a task-relevant knowledge base

  • Improving continuity of complex tasks

FAQ from Chroma MCP Server

What is Chroma?

Chroma is an open-source embedding database.

What is MCP?

MCP stands for Model Context Protocol, an open protocol for connecting AI applications to external tools and data sources.

What embedding functions are available?

Available embedding functions include 'default'/'fast', 'accurate', 'openai', 'cohere', 'huggingface', 'jina', 'voyageai', and 'gemini'.

How do I configure the server?

The server can be configured with command-line options or environment variables.
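As an alternative to command-line options, the same settings could be supplied through the server process's environment, for example via the env block of .cursor/mcp.json. This is only a sketch: the variable names CHROMA_CLIENT_TYPE, CHROMA_DATA_DIR, CHROMA_LOG_DIR, and CHROMA_EMBEDDING_FUNCTION are assumed placeholders, so check the API Reference for the names the server actually reads.

  {
    "mcpServers": {
      "chroma": {
        "command": "uvx",
        "args": ["chroma-mcp-server"],
        "env": {
          "CHROMA_CLIENT_TYPE": "persistent",
          "CHROMA_DATA_DIR": "/path/to/data",
          "CHROMA_LOG_DIR": "/path/to/logs",
          "CHROMA_EMBEDDING_FUNCTION": "accurate"
        }
      }
    }
  }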

Where can I find more information?

See the API Reference, Getting Started guide, Cursor Integration guide, Developer Guide, and Embeddings and Thinking Tools Guide in the documentation.