MCP Gemini
by drkhan107
This is a working demo of the Model Context Protocol (MCP) integrated with Google's Gemini. It provides a server and client implementation for interacting with Gemini using the MCP.
What is MCP Gemini?
The Model Context Protocol (MCP) Gemini is a demonstration of how to integrate the MCP with Google's Gemini model. It includes a server that exposes Gemini's capabilities through the MCP and a client to interact with the server.
How to use MCP Gemini?
To use MCP Gemini, clone the repository, add your Google API key to a .env file, install the dependencies with pip, and run the server (sse_server.py). Optionally, start the client (ssc_client.py) and the FastAPI backend (fastapp.py) for the GUI. Finally, launch the Streamlit app (app.py) to connect to the MCP server and interact with Gemini through a browser interface.
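The first setup step, loading the API key from the .env file, can be sketched in plain Python. This is a minimal, hedged example: the variable name GOOGLE_API_KEY and the manual parser are assumptions for illustration, not the repository's actual code (which may use a helper such as python-dotenv instead).

```python
import os

def load_env_file(path=".env"):
    """Parse simple KEY=VALUE lines from a .env file into os.environ."""
    if not os.path.exists(path):
        return
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Example: create a throwaway .env and load it
with open(".env", "w") as fh:
    fh.write("GOOGLE_API_KEY=your-key-here\n")
load_env_file()
print(os.environ["GOOGLE_API_KEY"])
```

The server can then read the key from the environment at startup instead of hard-coding it in source.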
Key features of MCP Gemini
Integration with Google's Gemini
Server-Sent Events (SSE) based communication
Example client implementation
FastAPI integration for GUI
Streamlit app for user interface
Configurable port
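Because the transport is Server-Sent Events, each server-to-client message is framed as optional `event:` and one or more `data:` lines terminated by a blank line. A minimal sketch of that framing (per the SSE format generally, not this project's specific wire protocol):

```python
def sse_encode(data: str, event: str = None) -> str:
    """Encode a payload as a Server-Sent Events frame."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    # Multi-line payloads become one "data:" line each, per the SSE format.
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"

print(sse_encode("hello", event="message"))
# The trailing blank line marks the end of one event.
```

A client consumes the stream by buffering lines until it sees the blank separator, then joining the `data:` payloads back together.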
Use cases of MCP Gemini
Demonstrating MCP integration with LLMs
Building AI-powered applications with Gemini
Prototyping conversational AI systems
Experimenting with context management in LLMs
Creating interactive AI experiences
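For the context-management use case, one simple approach is a rolling window of conversation turns that is flattened into the model prompt. This is a hedged sketch of that idea; the demo's actual context handling may work differently:

```python
from collections import deque

class ConversationContext:
    """Keep the last max_turns (role, text) pairs as model context."""

    def __init__(self, max_turns=10):
        # deque with maxlen evicts the oldest turn automatically
        self.turns = deque(maxlen=max_turns)

    def add(self, role, text):
        self.turns.append((role, text))

    def as_prompt(self):
        """Flatten the retained history into a plain-text prompt block."""
        return "\n".join(f"{role}: {text}" for role, text in self.turns)

ctx = ConversationContext(max_turns=2)
ctx.add("user", "Hi")
ctx.add("model", "Hello!")
ctx.add("user", "What is MCP?")  # oldest turn ("Hi") is evicted
print(ctx.as_prompt())
```

Capping the window keeps the prompt within the model's context limit at the cost of forgetting older turns.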
FAQ from MCP Gemini
What is MCP?
MCP stands for Model Context Protocol, an open protocol that standardizes how applications provide context, such as tools and data sources, to large language models.
What is Gemini?
Gemini is Google's family of multimodal large language models.
Do I need a Google API key?
Yes, you need a valid Google API key to use this demo.
What are the default ports?
The MCP server defaults to port 8080, and the Streamlit app defaults to port 8501.
How do I change the port for FastAPI?
Edit the fastapp.py file to change the port configuration.
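Rather than editing the source each time, one option is to resolve the port from an environment variable with a fallback. This is a hedged sketch: the variable name FASTAPI_PORT and the uvicorn entry point are assumptions for illustration, not the repository's actual configuration.

```python
import os

def get_port(default=8000):
    """Resolve the FastAPI port from the environment, with a fallback default."""
    return int(os.environ.get("FASTAPI_PORT", default))

# In fastapp.py this value would typically be passed to the server runner, e.g.:
# uvicorn.run(app, host="0.0.0.0", port=get_port())
os.environ["FASTAPI_PORT"] = "9000"
print(get_port())
```

This lets each deployment pick its own port without touching the code.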