MCP Gemini Server
by amitsh06
This project implements a server that follows the Model Context Protocol (MCP), allowing AI assistants to communicate with Google's Gemini models. With this MCP server, AI assistants can request text generation, perform text analysis, and maintain chat conversations through the Gemini API.
What is MCP Gemini Server?
This is a server implementation of the Model Context Protocol (MCP) that enables AI assistants like Claude to interact with Google's Gemini API for text generation, analysis, and chat conversations.
How to use MCP Gemini Server?
1. Clone the repository.
2. Create and activate a virtual environment.
3. Install dependencies.
4. Create a `.env` file with your Gemini API key.
5. Start the server using `python server.py`.
6. Send MCP requests to the `/mcp` endpoint using the POST method (see the example below).
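Once the server is running, a request can be sent from any HTTP client. Below is a minimal Python sketch; the port and the payload field names ("action", "parameters") are assumptions about this server's schema, so check server.py for the actual format:

```python
import requests

# NOTE: the port and the payload field names are assumptions;
# consult server.py for the schema this server actually expects.
response = requests.post(
    "http://localhost:5000/mcp",
    json={
        "action": "generate_text",
        "parameters": {"prompt": "Write a haiku about the sea"},
    },
)
print(response.json())
```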
Key features of MCP Gemini Server
Client-Server Communication (MCP protocol)
Message Processing
Error Handling & Logging
Environment Variables Support
API Testing & Debugging
Use cases of MCP Gemini Server
Integrating Gemini API with AI assistants
Text generation tasks
Text analysis tasks (sentiment, summary, keywords)
Chatbot development
FAQ from MCP Gemini Server
What is the Model Context Protocol (MCP)?
MCP is a protocol that enables AI assistants to interact with models like Gemini.
What is the main endpoint for MCP requests?
The main endpoint is `/mcp`, accessed via the POST method.
What are the available MCP actions?
The available actions are `generate_text`, `analyze_text`, and `chat`.
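For illustration, request bodies for the three actions might look like the following sketch; every field name apart from the action values themselves is an assumption rather than something documented by the project:

```python
# Hypothetical payloads; only the action names are confirmed by the project.
generate_request = {
    "action": "generate_text",
    "parameters": {"prompt": "Summarize the MCP protocol in one sentence"},
}
analyze_request = {
    "action": "analyze_text",
    # sentiment, summary, and keywords are listed as supported analysis tasks
    "parameters": {"text": "I love this product!", "analysis_type": "sentiment"},
}
chat_request = {
    "action": "chat",
    "parameters": {"message": "Hello!", "conversation_id": "demo-1"},
}
```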
How do I provide my Gemini API key?
You need to create a `.env` file in the root directory and set the `GEMINI_API_KEY` variable.
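For example, the `.env` file can consist of a single line (replace the placeholder with your actual key):

```
GEMINI_API_KEY=your_api_key_here
```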
How can I test the server?
You can use the included `test_client.py` script to test various functionalities.
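Assuming the script takes no required arguments (its interface is not documented here), it can be run directly once the server is up:

```
python test_client.py
```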