MemGPT MCP Server
by Vic563
A TypeScript-based MCP server that implements a memory system for LLMs. It provides tools for chatting with different LLM providers while maintaining conversation history.
What is MemGPT MCP Server?
The MemGPT MCP Server is a TypeScript-based server that acts as a memory system for Large Language Models (LLMs). It allows users to interact with various LLM providers while preserving conversation history, enabling more contextual and coherent interactions.
How to use MemGPT MCP Server?
To use the server, install the dependencies with `npm install`, build it with `npm run build`, and then configure it within Claude Desktop by adding the server config to the `claude_desktop_config.json` file. Ensure you set the necessary environment variables for your chosen LLM providers (OPENAI_API_KEY, ANTHROPIC_API_KEY, OPENROUTER_API_KEY). You can then interact with the server through the provided tools: `chat`, `get_memory`, `clear_memory`, `use_provider`, and `use_model`.
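As a minimal sketch, the Claude Desktop entry typically looks like the following. The server name (`memgpt`) and the path to the built `index.js` are placeholders for your local install, and you only need the API keys for the providers you actually use:

```json
{
  "mcpServers": {
    "memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```

After editing the file, restart Claude Desktop so it picks up the new server configuration.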
Key features of MemGPT MCP Server
Supports multiple LLM providers (OpenAI, Anthropic, OpenRouter, Ollama)
Maintains conversation history (memory)
Provides tools for managing memory (get_memory, clear_memory)
Allows switching between providers and models
Supports Claude 3 and 3.5 series models
Supports unlimited memory retrieval (pass limit: null to get_memory)
Use cases of MemGPT MCP Server
Building chatbots with persistent memory
Creating AI assistants that remember past interactions
Developing conversational applications that require context
Experimenting with different LLM providers and models
Integrating memory capabilities into existing LLM workflows
FAQ from MemGPT MCP Server
How do I switch between LLM providers?
Use the `use_provider` tool and specify the desired provider (e.g., OpenAI, Anthropic, OpenRouter, Ollama).
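As a sketch of what a client sends under the hood, switching providers is an MCP `tools/call` request. The lowercase provider value is an assumption; check the server's tool schema for the exact accepted strings:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "use_provider",
    "arguments": { "provider": "anthropic" }
  }
}
```

Within Claude Desktop you normally trigger this by asking the assistant to switch providers rather than crafting the request yourself.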
How do I clear the conversation history?
Use the `clear_memory` tool to remove all stored memories.
How do I retrieve the conversation history?
Use the `get_memory` tool. You can specify a `limit` parameter to retrieve a specific number of memories, or use `limit: null` to retrieve all memories.
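As a sketch, the corresponding MCP `tools/call` request for retrieving the full history looks like this (to fetch a fixed number instead, replace `null` with an integer such as `10`):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_memory",
    "arguments": { "limit": null }
  }
}
```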
What models are supported?
The server supports provider-specific models, including Claude 3 series (Haiku, Sonnet, Opus), Claude 3.5 series (Haiku, Sonnet), OpenAI models (gpt-4o, gpt-4o-mini, gpt-4-turbo), OpenRouter models (any model in 'provider/model' format), and Ollama models (any locally available model).
How do I debug the MCP server?
Use the MCP Inspector by running `npm run inspector`. This will provide a URL to access debugging tools in your browser.