
MCP Gemini Server

by bsmi021

This project provides an MCP server that wraps the `@google/genai` SDK, exposing Google's Gemini model capabilities as standard MCP tools. It simplifies integration with Gemini models by providing a consistent, tool-based interface managed via the MCP standard.


What is MCP Gemini Server?

The MCP Gemini Server is a dedicated server that exposes Google's Gemini model capabilities as standard MCP (Model Context Protocol) tools, allowing other LLMs and MCP-compatible systems to use Gemini as a backend workhorse through a consistent, tool-based interface.

How to use MCP Gemini Server?

To use the server, you need Node.js (v18 or later) and a Google AI Studio API key. Install the server manually or via Smithery, configure your MCP client (e.g., Cline or Claude Desktop) with the server's launch command and API key, and restart the client. The available MCP tools can then be invoked via `<use_mcp_tool>` tags with the appropriate arguments.
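As a rough sketch of the client-side setup (the exact configuration keys depend on your MCP client, and the command path and environment-variable name below are illustrative, not documented here), a configuration entry might look like:

```json
{
  "mcpServers": {
    "gemini-server": {
      "command": "node",
      "args": ["/path/to/mcp-gemini-server/dist/server.js"],
      "env": {
        "GOOGLE_GEMINI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```

After saving the configuration, restart the MCP client so it launches the server and discovers its tools.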

Key features of MCP Gemini Server

  • Core Generation (text generation and streaming)

  • Function Calling (enables Gemini to request client-defined functions)

  • Stateful Chat (manages conversational context)

  • File Handling (upload, list, retrieve, delete files)

  • Caching (create, list, retrieve, update, delete cached content)
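A tool invocation from an MCP client follows the standard `<use_mcp_tool>` pattern mentioned above. The server name, tool name, and argument key below are assumptions for illustration (this page does not list the exact tool name for plain text generation):

```xml
<use_mcp_tool>
  <server_name>gemini-server</server_name>
  <tool_name>gemini_generateContent</tool_name>
  <arguments>
    {
      "prompt": "Summarize the Model Context Protocol in two sentences."
    }
  </arguments>
</use_mcp_tool>
```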

Use cases of MCP Gemini Server

  • Integrating Gemini's text generation capabilities into other LLM workflows.

  • Using Gemini for function calling within an MCP environment.

  • Building stateful chat applications powered by Gemini.

  • Managing files and cached content within a Gemini-powered system.

  • Leveraging Gemini's models through a standardized MCP interface.

FAQ from MCP Gemini Server

What is the Model Context Protocol (MCP)?

MCP (Model Context Protocol) is an open standard for connecting language-model applications to external tools and data sources, allowing different LLMs and systems to discover and invoke each other's capabilities through a common interface.

What is the difference between Google AI Studio API keys and Vertex AI credentials?

The File Handling and Caching APIs are only compatible with Google AI Studio API keys and are not supported when using Vertex AI credentials. This server does not currently support Vertex AI authentication.

How do I handle errors when using the server?

The server returns structured errors using the MCP-standard `McpError` type, which includes a code, a message, and optional details to aid troubleshooting.
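As an illustration of the shape of such an error (the specific code, message, and details values here are invented; the actual fields follow the MCP SDK's `McpError`), a failed call might return something like:

```json
{
  "code": "InvalidParams",
  "message": "filePath must be an absolute path",
  "details": {
    "parameter": "filePath"
  }
}
```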

What are the required parameters for the gemini_uploadFile tool?

The `gemini_uploadFile` tool requires the `filePath` parameter, which must be an absolute path to a file on the server's file system.
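A sketch of an upload call (the server name and the example path are assumptions; only the required `filePath` parameter is documented here):

```xml
<use_mcp_tool>
  <server_name>gemini-server</server_name>
  <tool_name>gemini_uploadFile</tool_name>
  <arguments>
    {
      "filePath": "/absolute/path/to/document.pdf"
    }
  </arguments>
</use_mcp_tool>
```

Note that a relative path will be rejected, since the path is resolved on the server's file system, not the client's.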

Is true streaming supported for gemini_generateContentStream?

No. The current implementation is a workaround: it collects all chunks internally and returns the full text once generation completes. True incremental streaming to the MCP client is not yet implemented.