CodeCompass
by alvinveroy
CodeCompass is an AI coding assistant for Vibe Coding, leveraging the Model Context Protocol (MCP) to connect Git repositories to AI assistants. It can run locally with Ollama for privacy or be configured with OpenAI for cloud power, integrating with VSCode, Cursor, and Claude for seamless development.
CodeCompass: Your AI-Powered Vibe Coding Companion with MCP
Struggling to debug complex code or implement new features? CodeCompass transforms your Git repositories into an AI-driven knowledge base, empowering Vibe Coding—a revolutionary approach where you describe tasks in natural language, and AI generates code. As a Model Context Protocol (MCP) server, CodeCompass connects AI assistants like Claude to your codebase, delivering context-aware coding assistance. Built with Qdrant for vector storage and Ollama for local privacy, it’s configurable to use cloud models like OpenAI.
A cornerstone of the Vibe coder arsenal, CodeCompass streamlines debugging, feature implementation, and codebase exploration. Star the CodeCompass GitHub and join the future of AI-driven development!
What is Vibe Coding?
Vibe Coding, coined by Andrej Karpathy in February 2025, lets developers use natural language prompts to instruct AI to generate code. Describe the “vibe” of your project—like “build a login page”—and AI delivers, making coding accessible to all. CodeCompass enhances this by providing AI assistants with deep repository context.
What is the Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is an open standard by Anthropic that connects AI assistants to data sources, such as Git repositories, for relevant responses. CodeCompass implements MCP to serve codebase data, enabling precise AI-driven coding assistance for Vibe Coding.
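Under the hood, MCP messages are JSON-RPC 2.0. As an illustrative sketch (not CodeCompass's actual wire traffic), a client invoking a server tool sends a request shaped like this:

```typescript
// Minimal sketch of an MCP "tools/call" request as a client would send it.
// MCP uses JSON-RPC 2.0; the tool name and arguments below are examples.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Example: ask a CodeCompass-style server to search the indexed repository.
const req = makeToolCall(1, "search_code", { query: "authentication endpoint" });
console.log(JSON.stringify(req));
```

In practice an MCP client library (or an MCP-aware assistant such as Claude) builds and transports these messages for you; the point is that every tool shown later in this README is reachable through the same uniform request shape.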
Why Choose CodeCompass?
- Local Privacy with Ollama: Runs models locally by default for data security, ideal for sensitive projects.
- Cloud Flexibility: Configurable to use online models like OpenAI or DeepSeek for enhanced performance.
- Agentic RAG: An AI agent autonomously retrieves code, documentation, and metadata for comprehensive answers.
- Vibe Coding Ready: Supports natural language prompts for intuitive code generation.
- Developer-Friendly: Integrates with VSCode, Cursor, Zed, Claude Desktop, and more.
- Metrics & Diagnostics: Built-in tools for tracking performance and diagnosing issues.
- Project Management: Integration with TaskMaster AI for task tracking and management.
- Knowledge Graph: Build and query knowledge graphs about your codebase with MCP Memory.
- Library Documentation: Access up-to-date documentation for libraries with Context7.
Installation
Prerequisites
- Node.js (v20+)
- TypeScript (v5+)
- Docker (for Qdrant)
- Ollama (for local models: `nomic-embed-text:v1.5`, `llama3.1:8b`) or an OpenAI API key (for cloud models)
- A local Git repository
Setup
- Start Qdrant:

  ```shell
  docker run -d -p 127.0.0.1:6333:6333 qdrant/qdrant
  ```

- Set up AI models:
  - Local (Ollama):

    ```shell
    ollama pull nomic-embed-text:v1.5
    ollama pull llama3.1:8b
    ollama serve
    ```

  - Cloud (OpenAI): obtain an OpenAI API key.
Installation Options
- Clone and install:

  ```shell
  git clone https://github.com/alvinveroy/CodeCompass
  cd codecompass
  npm install
  npm run build
  ```

- Using npx:

  ```shell
  npx @alvinveroy/codecompass
  ```

- Using Docker:

  ```shell
  docker pull alvinveroy/codecompass:latest
  docker run -p 3000:3000 -v /path/to/your/repo:/app/repo alvinveroy/codecompass
  ```
Configuration
Set environment variables in your shell, MCP client, or `.env` file:
| Variable | Default Value | Description |
|---------------------------|-----------------------------------|------------------------------------------|
| `LLM_PROVIDER` | `ollama` | AI provider (`ollama`, `openai`, or `deepseek`) |
| `OLLAMA_HOST` | `http://localhost:11434` | Ollama server address (for `ollama`) |
| `OPENAI_API_KEY` | None | OpenAI API key (for `openai`) |
| `DEEPSEEK_API_KEY` | None | DeepSeek API key (for `deepseek`) |
| `DEEPSEEK_API_URL` | Default DeepSeek API endpoint | Custom DeepSeek API endpoint (optional) |
| `QDRANT_HOST` | `http://localhost:6333` | Qdrant server address |
| `EMBEDDING_MODEL` | `nomic-embed-text:v1.5` | Embedding model (Ollama) |
| `SUGGESTION_MODEL` | `llama3.1:8b` | Suggestion model (Ollama) |
| `OPENAI_EMBEDDING_MODEL` | `text-embedding-ada-002` | Embedding model (OpenAI) |
| `OPENAI_SUGGESTION_MODEL` | `gpt-4o` | Suggestion model (OpenAI) |
| `MCP_PORT` | `3000` | MCP server port |
| `LOG_LEVEL` | `info` | Logging level |
Example `.env` for OpenAI:

```shell
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-xxx
QDRANT_HOST=http://localhost:6333
MCP_PORT=3000
OPENAI_EMBEDDING_MODEL=text-embedding-ada-002
OPENAI_SUGGESTION_MODEL=gpt-4o
```
Example `.env` for DeepSeek:

```shell
LLM_PROVIDER=deepseek
DEEPSEEK_API_KEY=sk-xxx
QDRANT_HOST=http://localhost:6333
MCP_PORT=3000
```
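For reference, `.env` files like the ones above are plain `KEY=VALUE` lines. Node.js projects typically load them with the `dotenv` package, but the format is simple enough that a minimal parser (illustrative sketch only, not CodeCompass's loader) fits in a few lines:

```typescript
// Illustrative sketch: parse KEY=VALUE lines, skipping blanks and # comments.
// Real projects should use a library such as dotenv, which also handles
// quoting and value expansion.
function parseDotenv(text: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (trimmed === "" || trimmed.startsWith("#")) continue;
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // not a KEY=VALUE line
    out[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return out;
}

const env = parseDotenv("LLM_PROVIDER=deepseek\nQDRANT_HOST=http://localhost:6333\n");
console.log(env.LLM_PROVIDER); // "deepseek"
```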
Setting the DeepSeek API key via CLI:

```shell
npm run set-deepseek-key YOUR_API_KEY
```
Usage
Interact with CodeCompass via MCP using tools optimized for Vibe Coding:
- View Repository Structure: `server.resource("repo://structure")`
- Search Code: `server.tool("search_code", { query: "authentication endpoint" })`
- Get Repository Context: `server.tool("get_repository_context", { query: "Add user login" })`
- Generate Suggestion: `server.tool("generate_suggestion", { query: "Fix null pointer in auth", code: "..." })`
- Agent Query (multi-step reasoning): `server.tool("agent_query", { query: "How does the authentication flow work?", maxSteps: 5 })`
- Check Provider Status: `server.tool("check_provider", { verbose: true })`
- Switch Models: `server.tool("switch_suggestion_model", { model: "llama3.1:8b" })`
- Access Changelog: `server.tool("get_changelog", {})`
- Manage Tasks with TaskMaster: `server.tool("taskmaster-ai", "get_tasks", { projectRoot: "/path/to/project" })`
- Build Knowledge Graph: `server.tool("@modelcontextprotocol/memory", "create_entities", { entities: [...] })`
- Get Library Documentation: `server.tool("context7", "get-library-docs", { context7CompatibleLibraryID: "vercel/nextjs" })`
Vibe Coding Example
Scenario: You want to implement user authentication.
- Prompt AI: Tell your assistant (e.g., Claude), “Add OAuth authentication to my app.”
- Context Retrieval: CodeCompass fetches relevant code and documentation via MCP.
- AI Response: Suggests OAuth implementation with code snippets tailored to your repository.
- Refine: Ask, “Use Google OAuth,” and CodeCompass updates the context for a refined suggestion.
Integration with Development Tools
CodeCompass integrates seamlessly with popular IDEs and tools, enhancing your Vibe Coding workflow. Below are detailed setup instructions.
Cursor
- Install Cursor.
- Open Cursor settings (`cursor.json`).
- Add CodeCompass as a custom command:

  ```json
  {
    "commands": [
      {
        "name": "CodeCompass",
        "command": "npx",
        "args": ["-y", "@alvinveroy/codecompass@latest"],
        "env": {
          "LLM_PROVIDER": "ollama",
          "OLLAMA_HOST": "http://localhost:11434",
          "QDRANT_HOST": "http://localhost:6333"
        }
      }
    ]
  }
  ```

- For OpenAI, update `env`:

  ```json
  {
    "env": {
      "LLM_PROVIDER": "openai",
      "OPENAI_API_KEY": "sk-xxx",
      "QDRANT_HOST": "http://localhost:6333"
    }
  }
  ```

- Use via Cursor’s AI interface: prompt “Debug my login function,” and CodeCompass provides context.
VSCode
- Install VSCode and the Codeium extension for AI support.
- Create a `.vscode/settings.json` file:

  ```json
  {
    "codeium.customCommands": [
      {
        "name": "CodeCompass",
        "command": "npx",
        "args": ["-y", "@alvinveroy/codecompass@latest"],
        "env": {
          "LLM_PROVIDER": "ollama",
          "OLLAMA_HOST": "http://localhost:11434",
          "QDRANT_HOST": "http://localhost:6333"
        }
      }
    ]
  }
  ```

- For OpenAI, modify `env` as above.
- Access via Codeium’s chat: ask “Suggest a REST API structure,” and CodeCompass enhances the response.
Windsurf
- Install Windsurf (AI-powered IDE).
- Configure Windsurf’s settings (`windsurf.json`):

  ```json
  {
    "customTools": [
      {
        "name": "CodeCompass",
        "command": "npx",
        "args": ["-y", "@alvinveroy/codecompass@latest"],
        "env": {
          "LLM_PROVIDER": "ollama",
          "OLLAMA_HOST": "http://localhost:11434",
          "QDRANT_HOST": "http://localhost:6333"
        }
      }
    ]
  }
  ```

- For OpenAI, update `env` accordingly.
- Prompt Windsurf’s AI: “Explore my codebase for database models,” and CodeCompass provides context.
Zed
- Install Zed.
- Configure Zed’s settings (`settings.json`):

  ```json
  {
    "assistant": {
      "custom_commands": [
        {
          "name": "CodeCompass",
          "command": "npx",
          "args": ["-y", "@alvinveroy/codecompass@latest"],
          "env": {
            "LLM_PROVIDER": "ollama",
            "OLLAMA_HOST": "http://localhost:11434",
            "QDRANT_HOST": "http://localhost:6333"
          }
        }
      ]
    }
  }
  ```

- For OpenAI, adjust `env`.
- Use Zed’s assistant: ask “Implement a user profile page,” and CodeCompass supplies relevant data.
Claude Desktop
- Install Claude Desktop (if available).
- Configure via a custom script or `.env` file:

  ```shell
  LLM_PROVIDER=ollama
  OLLAMA_HOST=http://localhost:11434
  QDRANT_HOST=http://localhost:6333
  ```

- Run CodeCompass:

  ```shell
  npx @alvinveroy/codecompass
  ```

- For OpenAI, update `.env` with `OPENAI_API_KEY`.
- Prompt Claude: “Fix my API endpoint,” and CodeCompass enhances the response via MCP.
Claude Code
- Use Claude Code within supported IDEs or standalone.
- Install via Smithery:

  ```shell
  npx -y @smithery/cli install @alvinveroy/codecompass --client claude
  ```

- Configure environment variables in your IDE or `.env` as above.
- Prompt: “Generate a GraphQL schema,” and CodeCompass provides context.
Use Cases
| Use Case | Description | Benefit for Vibe Coding |
|-------------------------|--------------------------------------------------------------|----------------------------------------------------|
| Debugging | Query AI to identify and fix code errors. | Fast, context-aware solutions reduce downtime. |
| Feature Implementation | Describe features for AI-generated code. | Accelerates development with tailored suggestions. |
| Code Exploration | Navigate codebases with natural language queries. | Simplifies large project understanding. |
| Onboarding | Provide new developers with AI-driven codebase insights. | Eases integration with contextual explanations. |
Diagnostics and Troubleshooting
CodeCompass includes several diagnostic tools to help troubleshoot issues:
- Reset Metrics (clear all performance counters): `server.tool("reset_metrics", {})`
- Debug Provider (test provider configuration): `server.tool("debug_provider", {})`
- Model Switch Diagnostic (diagnose model switching issues): `server.tool("model_switch_diagnostic", {})`
- Get Changelog (view version history): `server.tool("get_changelog", {})`
- Get Session History (view detailed session information): `server.tool("get_session_history", { sessionId: "your-session-id" })`
Why CodeCompass for the Vibe Coder Arsenal?
CodeCompass is a must-have in the Vibe coder arsenal, a collection of tools for AI-driven development. By implementing MCP, it connects your repository to AI assistants, enabling Vibe Coding with:
- Privacy-First: Local Ollama models keep data secure.
- Flexible AI: Supports cloud models like OpenAI and DeepSeek for versatility.
- Seamless Integration: Enhances IDEs for efficient workflows.
- Democratized Coding: Makes coding accessible via natural language.
- Metrics & Diagnostics: Built-in tools for performance monitoring and troubleshooting.
- Project Management: Integrated TaskMaster AI for comprehensive project tracking.
- Knowledge Representation: Build and query knowledge graphs about your codebase.
- Documentation Access: Retrieve up-to-date library documentation with Context7.
Contributing
Join our community! See CONTRIBUTING.md for guidelines.