Code Assistant
by stippi
Code Assistant is a CLI tool built in Rust that assists with code-related tasks such as codebase exploration, file manipulation, and summarization. It can also run as a Model Context Protocol (MCP) server for integration with LLM clients like Claude Desktop.
What is Code Assistant?
Code Assistant is a command-line tool and MCP server that helps developers automate code-related tasks using Large Language Models (LLMs). It allows for autonomous exploration of codebases, file reading/writing, and integration with LLM clients.
How to use Code Assistant?
Code Assistant can be used in two modes: Agent Mode (the default) and Server Mode. Agent Mode lets you run specific tasks against a codebase via command-line arguments. Server Mode runs the tool as an MCP server, so it can be used as a plugin by LLM clients like Claude Desktop. Configuration involves setting up project paths and MCP server settings.
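The project paths mentioned above are configured in a JSON file. A minimal sketch of what such a configuration might look like (the exact schema and field names are assumptions, not taken from the project's documentation):

```json
{
  "my-project": {
    "path": "/absolute/path/to/my-project"
  }
}
```

Each key names a project the agent is allowed to explore, mapped to its location on disk.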
Key features of Code Assistant
Autonomous Exploration
Reading/Writing Files
Working Memory Management
File Summarization
Interactive Communication
MCP Server Mode
Use cases of Code Assistant
Analyzing code complexity
Listing API endpoints
Finding TODO comments
Documenting public APIs
Optimizing database queries
Integrating with LLM clients for code understanding and modification
FAQ from Code Assistant
What is the Model Context Protocol (MCP)?
The Model Context Protocol is a standard developed by Anthropic that allows LLMs to interact with external tools and resources.
How do I configure Code Assistant to work with Claude Desktop?
You need to configure your projects in ~/.config/code-assistant/projects.json and add the Code Assistant server to the claude_desktop_config.json file in Claude Desktop's settings.
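A minimal claude_desktop_config.json entry might look like the following. The mcpServers key is Claude Desktop's standard MCP configuration format; the binary path and the server argument are assumptions to adjust for your installation:

```json
{
  "mcpServers": {
    "code-assistant": {
      "command": "/path/to/code-assistant",
      "args": ["server"]
    }
  }
}
```

After restarting Claude Desktop, the server should appear among the available MCP tools.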
What LLM providers are supported?
The tool supports Anthropic, OpenAI, Vertex AI, Ollama, and AI Core.
What are the different agent modes?
The tool supports working_memory and message_history agent modes, which affect how the agent manages context and interacts with the LLM.
How do I provide API keys for the different LLM providers?
API keys are typically provided through environment variables such as ANTHROPIC_API_KEY, OPENAI_API_KEY, GOOGLE_API_KEY, and OPENROUTER_API_KEY.
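For example, before launching the tool you would export the key for your chosen provider in your shell (the values below are placeholders, not real keys):

```shell
# Placeholder values -- substitute the real keys for the providers you use
export ANTHROPIC_API_KEY="placeholder-anthropic-key"
export OPENAI_API_KEY="placeholder-openai-key"
```

Only the variables for the providers you actually use need to be set.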