
Memory MCP Server

by Sinhan88

The Memory MCP Server implements the Model Context Protocol (MCP) to provide long-term memory for Large Language Models (LLMs). It acts as a bridge, allowing LLMs to retain information over extended interactions.



What is Memory MCP Server?

The Memory MCP Server provides long-term memory for Large Language Models (LLMs) by implementing the Model Context Protocol (MCP). By persisting context across multiple interactions, it enables LLMs to give more relevant, context-aware responses.

How to use Memory MCP Server?

To use the Memory MCP Server, first clone the repository, install the dependencies with pip install -r requirements.txt, and start the server with python app.py. You can then store and retrieve memory through the server's API endpoints using POST and GET requests, respectively. Example curl commands are provided in the README.
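The setup steps above can be sketched as a short command sequence. The repository URL is not given on this page, so a placeholder is used below; replace it with the project's actual GitHub URL.

```shell
# Clone the repository (replace <repo-url> with the project's GitHub URL)
git clone <repo-url> memory-mcp-server
cd memory-mcp-server

# Install the Python dependencies
pip install -r requirements.txt

# Start the server
python app.py
```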

Key features of Memory MCP Server

  • Long-term Memory

  • Model Context Protocol

  • User-Friendly API

  • Scalability

Use cases of Memory MCP Server

  • Enhancing chatbot memory

  • Improving context-aware AI applications

  • Building more personalized AI experiences

  • Enabling LLMs to handle complex, multi-turn conversations

FAQ from Memory MCP Server

What is the Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is an open standard for connecting LLM applications to external tools and data sources; this server uses it to expose memory functionality to LLMs in a model-agnostic way.

How do I store memory in the server?

Send a POST request to the /store endpoint with a JSON body containing the context and model ID.
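A minimal client sketch of that POST request, using only the standard library. The base URL and port (localhost:5000) and the exact JSON field names are assumptions, since the page only says the body contains the context and model ID; check the README for the real schema.

```python
import json
import urllib.request

def build_store_request(base_url, model_id, context):
    """Build a POST request for the /store endpoint.

    The JSON body carries the context and model ID, as described in the
    FAQ; the field names and base URL here are assumptions.
    """
    payload = json.dumps({"model_id": model_id, "context": context}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/store",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (assumes the server is running locally on port 5000):
req = build_store_request("http://localhost:5000", "gpt-4", "User prefers metric units.")
# urllib.request.urlopen(req)  # uncomment once the server is running
```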

How do I retrieve memory from the server?

Send a GET request to the /retrieve endpoint with the model ID as a query parameter.
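The matching GET request can be sketched the same way. The query-parameter name (model_id) and the local base URL are assumptions; the page only says the model ID is passed as a query parameter.

```python
import urllib.parse
import urllib.request

def build_retrieve_request(base_url, model_id):
    """Build a GET request for the /retrieve endpoint, with the model ID
    passed as a query parameter (parameter name assumed)."""
    query = urllib.parse.urlencode({"model_id": model_id})
    return urllib.request.Request(f"{base_url}/retrieve?{query}", method="GET")

req = build_retrieve_request("http://localhost:5000", "gpt-4")
# response = urllib.request.urlopen(req)  # uncomment once the server is running
# memory = response.read()                # JSON body returned by the server
```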

What license is the project under?

This project is licensed under the MIT License.

How can I contribute to the project?

Fork the repository, create a branch, make your changes, commit them, push to the branch, and open a pull request.
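That contribution workflow maps onto the usual git commands. The fork URL and branch name below are placeholders.

```shell
# Fork the repository on GitHub, then clone your fork
git clone <your-fork-url> memory-mcp-server
cd memory-mcp-server

# Create a feature branch
git checkout -b my-feature

# Make your changes, then commit and push them
git add .
git commit -m "Describe your change"
git push origin my-feature

# Finally, open a pull request against the upstream repository on GitHub
```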