Gemini Context MCP Server
by MCP-Mirror
A powerful MCP server leveraging Gemini's capabilities for context management and caching. It maximizes the value of Gemini's 2M token context window while providing tools for efficient caching of large contexts.
What is Gemini Context MCP Server?
This is an MCP (Model Context Protocol) server implementation that uses Gemini's large language model to manage conversational context and cache large prompts. It supports session-based conversations, semantic context search, and efficient reuse of large contexts, optimizing token usage and reducing costs.
How to use Gemini Context MCP Server?
To use this server, you need Node.js 18+ and a Gemini API key. Clone the repository, install the dependencies, configure the .env file with your API key, build the server, and start it. You can then integrate it with MCP-compatible clients such as Claude Desktop, Cursor, or VS Code, or drive it directly through the provided test scripts or the Node.js API.
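As a rough illustration, registering the built server with a client such as Claude Desktop or Cursor typically means adding an mcpServers entry like the one below. The server name, entry-point path, and API key variable name here are assumptions rather than values taken from this page, so adjust them to match your checkout and the README.

```json
{
  "mcpServers": {
    "gemini-context": {
      "command": "node",
      "args": ["/path/to/gemini-context-mcp-server/dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

Claude Desktop reads this from its claude_desktop_config.json file, while Cursor uses an mcp.json file with the same shape; check each client's documentation for the exact location.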
Key features of Gemini Context MCP Server
Up to 2M token context window support
Session-based conversations with smart context tracking
Semantic search for relevant context
Automatic context and cache cleanup
Large prompt caching for cost optimization
TTL management for cache expiration control
Use cases of Gemini Context MCP Server
Maintaining conversational context across multiple interactions
Efficiently reusing large system prompts and instructions
Reducing token usage costs for frequently used contexts
Integrating Gemini's capabilities into AI-enhanced development environments like Cursor
Creating AI assistants with memory and awareness of past conversations
FAQ from Gemini Context MCP Server
How do I get a Gemini API key?
You can obtain a Gemini API key from Google AI Studio: https://ai.google.dev/
What clients are compatible with this MCP server?
The server works with any MCP-compatible client, including Claude Desktop, Cursor, and VS Code with appropriate extensions.
How do I configure the server?
The server is configured using environment variables, which can be set in a .env file. Refer to the README for the available options.
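For example, a minimal .env might look like the following. Only the API key is required by the description above; the exact variable name and the commented-out optional settings are hypothetical placeholders, so confirm the real names in the README.

```
# Required: your Gemini API key from Google AI Studio
GEMINI_API_KEY=your-api-key-here

# Hypothetical optional settings -- confirm the real names in the README
# GEMINI_MODEL=gemini-1.5-pro
# PORT=3000
```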
How can I use the caching system?
You can create caches for large system instructions and generate content using those caches. See the usage examples in the README for code snippets.
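One way to exercise the caching tools from Node.js is through the MCP TypeScript SDK acting as a client. The sketch below is only an illustration: the tool names come from this page, but the argument names (displayName, content, ttlSeconds, cacheId, prompt) and the path to the built server are assumptions about the server's input schema, so check the README before relying on them.

```typescript
// Minimal sketch: drive the server's cache tools over stdio with the MCP TypeScript SDK.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server as a child process (path is an assumption).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });
  const client = new Client({ name: "cache-demo", version: "1.0.0" });
  await client.connect(transport);

  // Cache a large system instruction once...
  const created = await client.callTool({
    name: "mcp_gemini_context_create_cache",
    arguments: {
      displayName: "support-bot-instructions", // argument names are assumptions
      content: "You are a helpful support assistant...",
      ttlSeconds: 3600,
    },
  });
  console.log("cache created:", created);

  // ...then reuse it for cheaper follow-up generations.
  const reply = await client.callTool({
    name: "mcp_gemini_context_generate_with_cache",
    arguments: {
      cacheId: "support-bot-instructions", // identifier shape is an assumption
      prompt: "How do I reset my password?",
    },
  });
  console.log(reply);

  await client.close();
}

main().catch(console.error);
```

The same callTool pattern applies to the context-management tools described in the next answer.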
What are the available MCP tools?
The server provides tools for context management (generate_text, get_context, clear_context, add_context, search_context) and caching (mcp_gemini_context_create_cache, mcp_gemini_context_generate_with_cache, mcp_gemini_context_list_caches, mcp_gemini_context_update_cache_ttl, mcp_gemini_context_delete_cache).
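As a rough sketch of the context-management side, a session-based exchange might look like the following, reusing a connected client like the one in the caching example above. Again, only the tool names are taken from this page; argument names such as sessionId, content, prompt, and query are assumptions.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Hypothetical session flow: seed context, generate with it, then search it.
async function sessionDemo(client: Client) {
  // Store a piece of context under a session.
  await client.callTool({
    name: "add_context",
    arguments: { sessionId: "demo-session", content: "The user prefers concise answers." },
  });

  // Generate a reply that takes the tracked session context into account.
  const answer = await client.callTool({
    name: "generate_text",
    arguments: { sessionId: "demo-session", prompt: "Suggest a greeting for this user." },
  });

  // Semantic search over previously stored context.
  const related = await client.callTool({
    name: "search_context",
    arguments: { query: "user preferences" },
  });

  return { answer, related };
}
```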