# MCP Conversation Server

A Model Context Protocol (MCP) server implementation for managing conversations with OpenRouter's language models. This server provides a standardized interface for applications to interact with various language models through a unified conversation management system.
## Features

- **MCP Protocol Support**
  - Full MCP protocol compliance
  - Resource management and discovery
  - Tool-based interaction model
  - Streaming response support
  - Error handling and recovery
- **OpenRouter Integration**
  - Support for all OpenRouter models
  - Real-time streaming responses
  - Automatic token counting
  - Model context window management
  - Available models include:
    - Claude 3 Opus
    - Claude 3 Sonnet
    - Llama 2 70B
    - And many more from OpenRouter's catalog
- **Conversation Management**
  - Create and manage multiple conversations
  - Support for system messages
  - Message history tracking
  - Token usage monitoring
  - Conversation filtering and search
- **Streaming Support**
  - Real-time message streaming
  - Chunked response handling
  - Token counting
- **File System Persistence**
  - Conversation state persistence
  - Configurable storage location
  - Automatic state management
## Installation

```bash
npm install mcp-conversation-server
```
## Configuration

All configuration for the MCP Conversation Server is now provided via YAML. Update the `config/models.yaml` file with your settings. For example:

```yaml
# MCP Server Configuration
openRouter:
  apiKey: "YOUR_OPENROUTER_API_KEY" # Replace with your actual OpenRouter API key.

persistence:
  path: "./conversations" # Directory for storing conversation data.

models:
  # Define your models here
  'provider/model-name':
    id: 'provider/model-name'
    contextWindow: 123456
    streaming: true
    temperature: 0.7
    description: 'Model description'

# Default model to use if none is specified
defaultModel: 'provider/model-name'
```
### Server Configuration

The MCP Conversation Server loads all of its configuration from the YAML file. In your application, load the configuration as follows:

```typescript
// Loads openRouter, persistence, models, and defaultModel settings from 'config/models.yaml'
const config = await loadModelsConfig();
```

Note: Environment variables are no longer required, as all configuration is provided via the YAML file.
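For reference, a loader along these lines could sit behind `loadModelsConfig`. This is a minimal sketch, assuming the `js-yaml` package and the config shape shown above; the types are hypothetical mirrors of that YAML, not the package's actual implementation:

```typescript
import { readFile } from 'node:fs/promises';
import * as yaml from 'js-yaml';

// Hypothetical types mirroring the config/models.yaml shape shown above.
interface ModelConfig {
  id: string;
  contextWindow: number;
  streaming: boolean;
  temperature: number;
  description: string;
}

interface ServerConfig {
  openRouter: { apiKey: string };
  persistence: { path: string };
  models: Record<string, ModelConfig>;
  defaultModel: string;
}

// Read and parse config/models.yaml into a typed configuration object.
async function loadModelsConfig(path = 'config/models.yaml'): Promise<ServerConfig> {
  const raw = await readFile(path, 'utf8');
  return yaml.load(raw) as ServerConfig;
}
```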
## Usage

### Basic Server Setup

```typescript
import { ConversationServer } from 'mcp-conversation-server';

// `config` comes from loadModelsConfig() (see Server Configuration above).
const server = new ConversationServer(config);
server.run().catch(console.error);
```
### Available Tools

The server exposes several MCP tools (a client-side call sketch follows this list):

- **create-conversation**

  ```typescript
  {
      provider: 'openrouter';  // Provider is always 'openrouter'
      model: string;           // OpenRouter model ID (e.g., 'anthropic/claude-3-opus-20240229')
      title?: string;          // Optional conversation title
  }
  ```

- **send-message**

  ```typescript
  {
      conversationId: string;  // Conversation ID
      content: string;         // Message content
      stream?: boolean;        // Enable streaming responses
  }
  ```

- **list-conversations**

  ```typescript
  {
      filter?: {
          model?: string;      // Filter by model
          startDate?: string;  // Filter by start date
          endDate?: string;    // Filter by end date
      }
  }
  ```
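A hedged sketch of invoking these tools from a client, assuming the official `@modelcontextprotocol/sdk` client over a stdio connection; the server entry-point path is illustrative:

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the conversation server and connect over stdio (path is illustrative).
const transport = new StdioClientTransport({
  command: 'node',
  args: ['node_modules/mcp-conversation-server/build/index.js'],
});
const client = new Client(
  { name: 'example-client', version: '1.0.0' },
  { capabilities: {} }
);
await client.connect(transport);

// Create a conversation with one of the configured OpenRouter models.
const created = await client.callTool({
  name: 'create-conversation',
  arguments: {
    provider: 'openrouter',
    model: 'anthropic/claude-3-opus-20240229',
    title: 'My first conversation',
  },
});
console.log(created.content); // Tool result; the exact payload shape is server-defined.
```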
### Resources

The server provides access to several resources (see the read sketch after this list):

- **conversation://{id}**
  - Access specific conversation details
  - View message history
  - Check conversation metadata
- **conversation://list**
  - List all active conversations
  - Filter conversations by criteria
  - Sort by recent activity
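Reading these resources is a short call on the same SDK client; a minimal sketch, reusing the `client` from the tool example above (the conversation ID is a placeholder):

```typescript
// List all active conversations via the conversation://list resource.
const list = await client.readResource({ uri: 'conversation://list' });
console.log(list.contents);

// Fetch a specific conversation by ID.
const conversation = await client.readResource({
  uri: 'conversation://some-conversation-id',
});
console.log(conversation.contents);
```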
## Development

### Building

```bash
npm run build
```

### Running Tests

```bash
npm test
```
### Debugging

The server provides several debugging features:

- **Error Logging**
  - All errors are logged with stack traces
  - Token usage tracking
  - Rate limit monitoring
- **MCP Inspector**

  ```bash
  npm run inspector
  ```

  Use the MCP Inspector to:
  - Test tool execution
  - View resource contents
  - Monitor message flow
  - Validate protocol compliance
- **Provider Validation** (see the startup sketch after this list)

  ```typescript
  await server.providerManager.validateProviders();
  ```

  Validates:
  - API key validity
  - Model availability
  - Rate limit status
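One way to wire this in is to validate providers during startup, before the server accepts traffic. A minimal sketch, assuming the `ConversationServer` and `loadModelsConfig` shown earlier:

```typescript
import { ConversationServer } from 'mcp-conversation-server';

async function main() {
  const config = await loadModelsConfig();
  const server = new ConversationServer(config);

  // Fail fast on invalid API keys, unavailable models, or exhausted rate limits.
  await server.providerManager.validateProviders();

  await server.run();
}

main().catch((err) => {
  console.error('Server failed to start:', err);
  process.exit(1);
});
```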
## Troubleshooting

Common issues and solutions:

- **OpenRouter Connection Issues**
  - Verify that your API key is valid
  - Check rate limits on OpenRouter's dashboard
  - Ensure the model ID is correct
  - Monitor credit usage
- **Message Streaming Errors**
  - Verify that the model supports streaming
  - Check connection stability
  - Monitor token limits
  - Handle timeout settings
- **File System Errors** (a preflight check sketch follows this list)
  - Check directory permissions
  - Verify path configuration
  - Monitor disk space
  - Handle concurrent access
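For file system issues, a quick preflight check can confirm that the persistence directory exists and is writable before the server starts. A minimal sketch using Node's `fs` module, with the path taken from the YAML config above:

```typescript
import { access, mkdir } from 'node:fs/promises';
import { constants } from 'node:fs';

// Ensure the persistence directory exists and is writable.
async function checkPersistenceDir(path: string): Promise<void> {
  await mkdir(path, { recursive: true }); // Create it if missing (no-op otherwise)
  await access(path, constants.W_OK);     // Throws if the directory is not writable
}

await checkPersistenceDir('./conversations');
```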
## Contributing

1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Create a Pull Request
## License

ISC License