MCP (Model Context Protocol) Server
by sangminpark9
This MCP (Model Context Protocol) server is designed to facilitate communication with Large Language Models (LLMs). It supports DeepSeek and Llama models through a unified API.
What is MCP (Model Context Protocol) Server?
MCP is a protocol and server implementation that allows users to interact with different LLMs, such as DeepSeek and Llama, through a standardized API.
How to use MCP (Model Context Protocol) Server?
To use MCP, you need to clone the repository, install the necessary dependencies, and run the server. The README provides detailed instructions for setting up the environment, downloading models, and starting the server using either Python or Docker.
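Once the server is up, a quick way to confirm it is responding is to call the model-listing endpoint. This is a minimal sketch assuming the server listens on http://localhost:8000 and that the Python requests package is installed; adjust the base URL to match your setup.

```python
import requests

# Assumed base URL; replace with the host/port your server actually uses.
BASE_URL = "http://localhost:8000"

# Ask the server which models (e.g. DeepSeek, Llama) it currently exposes.
response = requests.get(f"{BASE_URL}/api/models")
response.raise_for_status()
print(response.json())
```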
Key features of MCP (Model Context Protocol) Server
Supports DeepSeek and Llama models
Provides a unified API for LLM interaction
Allows context management for LLMs
Offers API endpoints for chat, model listing, and session management
Use cases of MCP (Model Context Protocol) Server
Building chatbot applications
Integrating LLMs into existing systems
Experimenting with different LLMs using a consistent interface
Developing AI-powered services
FAQ from MCP (Model Context Protocol) Server
How to add new models?
1. Add the model to the app/models directory.
2. Implement the BaseModel interface.
3. Register the model in app/models/router.py.
4. Configure the model in app/core/config.py.
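For illustration, a new model class might look like the sketch below. The BaseModel name and the app/models location come from the steps above, but the import path and the method names (load, generate) are assumptions; check the actual BaseModel definition in the repository before implementing.

```python
# app/models/my_model.py (hypothetical file, following the steps above)
from app.models.base import BaseModel  # assumed import path for BaseModel


class MyModel(BaseModel):
    """Minimal sketch of a new model backend; method names are assumptions."""

    name = "my-model"

    def load(self) -> None:
        # Load the weights and tokenizer here, e.g. from a local checkpoint.
        ...

    def generate(self, prompt: str, **kwargs) -> str:
        # Run inference and return the generated text.
        return "generated text"
```

After adding the class, it would still need to be registered in app/models/router.py and configured in app/core/config.py as described in the steps above.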
How to contribute?
1. Create a new branch.
2. Implement your changes.
3. Commit your changes.
4. Push your branch.
5. Create a Pull Request.
What are the API endpoints?
The API endpoints include /api/chat for sending messages, /api/models for listing available models, and /api/sessions/{session_id} for deleting sessions.
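A short client sketch using these endpoints is shown below. Only the endpoint paths come from the list above; the base URL, the session identifier, and the JSON fields (model, message, session_id) are illustrative assumptions, so check the server's README for the actual request schema.

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed host/port; adjust to your deployment
SESSION_ID = "demo-session"         # hypothetical session identifier

# Send a chat message. The payload fields here are guesses for illustration;
# the real API may expect different field names.
chat = requests.post(
    f"{BASE_URL}/api/chat",
    json={"model": "llama", "message": "Hello!", "session_id": SESSION_ID},
)
chat.raise_for_status()
print(chat.json())

# Delete the session once the conversation is finished.
requests.delete(f"{BASE_URL}/api/sessions/{SESSION_ID}").raise_for_status()
```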
What models are supported?
Currently, DeepSeek and Llama models are supported.
What are the system requirements?
Python 3.8+, a CUDA-enabled GPU (optional; CPU is supported), and Docker & Docker Compose (optional).