MCP Waifu Chat Server
by waifuai
This project implements a basic MCP server for a conversational AI "waifu" character. It uses the `mcp` library for Python to handle the protocol details and `FastMCP` for easy server setup.
What is MCP Waifu Chat Server?
The MCP Waifu Chat Server is a Python-based implementation of a Model Context Protocol (MCP) server designed to facilitate conversational interactions with an AI character, often referred to as a "waifu". It leverages the `mcp` and `FastMCP` libraries to manage protocol specifics and streamline server configuration.
How to use MCP Waifu Chat Server?
To use the server, clone the repository, install the required dependencies using `uv sync --all-extras --dev`, configure the Google Gemini API key and other settings via environment variables (or a `.env` file), and then run the server using `uv run mcp-waifu-chat`. Interact with the server through its API endpoints, such as `/v1/server/status` for server status, and the `chat` tool for sending messages.
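Since the chat functionality is exposed as an MCP tool, one way to exercise it is through the official `mcp` Python SDK. The sketch below is a minimal client that launches the server over stdio and calls the `chat` tool; the argument names (`user_id`, `message`) are assumptions for illustration, so confirm them against the schema reported by `list_tools()`.

```python
# Minimal MCP client sketch: spawn the server over stdio and call its `chat`
# tool. The tool argument names ("user_id", "message") are assumptions; check
# the schema returned by list_tools() for the real ones.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server the same way `uv run mcp-waifu-chat` does.
    server = StdioServerParameters(command="uv", args=["run", "mcp-waifu-chat"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the tools the server advertises, then call the chat tool.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            result = await session.call_tool(
                "chat", arguments={"user_id": "alice", "message": "Hello!"}
            )
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```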
Key features of MCP Waifu Chat Server
User management (create, check existence, delete, count)
Dialog history storage (get, set, reset)
Basic chat functionality (using Google Gemini API)
Modular design for easy extension
Configuration via environment variables and API key file
SQLite database for persistence (see the sketch after this list)
Comprehensive unit tests
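The dialog-history and SQLite features above lend themselves to a very small persistence layer. The following is a hypothetical sketch of how per-user history could be stored, fetched, and reset; the table and column names are illustrative only, not the project's actual schema.

```python
# Hypothetical sketch of per-user dialog history persistence in SQLite.
# Table and column names are illustrative, not the project's actual schema.
import sqlite3

conn = sqlite3.connect("waifu_chat.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS dialogs (
        user_id TEXT PRIMARY KEY,
        history TEXT NOT NULL DEFAULT ''
    )
    """
)

def set_dialog(user_id: str, history: str) -> None:
    # Upsert the full history blob for a user.
    conn.execute(
        "INSERT INTO dialogs (user_id, history) VALUES (?, ?) "
        "ON CONFLICT(user_id) DO UPDATE SET history = excluded.history",
        (user_id, history),
    )
    conn.commit()

def get_dialog(user_id: str) -> str:
    row = conn.execute(
        "SELECT history FROM dialogs WHERE user_id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else ""

def reset_dialog(user_id: str) -> None:
    set_dialog(user_id, "")
```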
Use cases of MCP Waifu Chat Server
Creating a personal AI companion
Developing a conversational AI application
Experimenting with the Model Context Protocol
Building a chatbot with a specific persona
Integrating AI chat functionality into existing systems
FAQ from MCP Waifu Chat Server
Where do I get a Google Gemini API key?
You can obtain a key from Google AI Studio (https://aistudio.google.com/).
How do I configure the server?
The server is configured using a combination of a file for the API key (`~/.api-gemini`) and environment variables (or a `.env` file) for other settings.
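A small sketch of that scheme, assuming the key file described above and using python-dotenv to populate the environment from a local `.env` file; the `DATABASE_FILE` variable name is hypothetical, so use the names documented in the repository.

```python
# Illustrative configuration loading: Gemini API key from ~/.api-gemini,
# other settings from environment variables (optionally populated by a .env
# file via python-dotenv). DATABASE_FILE is a hypothetical variable name.
import os
from pathlib import Path

from dotenv import load_dotenv

load_dotenv()  # no-op if there is no .env file

api_key = (Path.home() / ".api-gemini").read_text().strip()
database_file = os.getenv("DATABASE_FILE", "waifu_chat.db")  # hypothetical
```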
What database does the server use?
The server uses an SQLite database for persistence by default, but you can configure it to use PostgreSQL or MySQL for production deployments.
How do I run the unit tests?
Run the unit tests using the command `uv run pytest`.
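Beyond the bundled test suite, a quick smoke test against a running instance can hit the `/v1/server/status` endpoint mentioned above; the base URL below is an assumption, so adjust it to your host and port.

```python
# Smoke-test sketch for a running server. The base URL is an assumption;
# the /v1/server/status endpoint is the one described earlier on this page.
import httpx

BASE_URL = "http://127.0.0.1:8000"  # hypothetical host/port

def test_server_status():
    response = httpx.get(f"{BASE_URL}/v1/server/status")
    assert response.status_code == 200
```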
How do I deploy the server to production?
For production deployment, use a production-ready WSGI/ASGI server such as Gunicorn, switch to a more robust database such as PostgreSQL or MySQL, implement proper logging, secure the server with HTTPS, and place it behind a reverse proxy such as Nginx or Apache.
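As one illustration of the ASGI route (using uvicorn here rather than Gunicorn, simply to keep the example self-contained Python), the application could be served with multiple workers behind the reverse proxy; the module path below is hypothetical and should point at the ASGI app the package actually exports.

```python
# Hypothetical production launcher with uvicorn. The import string
# "mcp_waifu_chat.app:app" is an assumption; replace it with the real
# ASGI application path exported by the package.
import uvicorn

if __name__ == "__main__":
    uvicorn.run(
        "mcp_waifu_chat.app:app",  # hypothetical import path
        host="0.0.0.0",
        port=8000,
        workers=4,          # multiple worker processes for production load
        log_level="info",
    )
```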