# 🧠 Memory MCP Server

by Sinhan88
## Overview
Welcome to the Memory MCP Server! This project implements the Model Context Protocol (MCP) to provide long-term memory for Large Language Models (LLMs). With the growing need for more context-aware AI applications, this server acts as a bridge, allowing LLMs to retain information over extended interactions.
## Table of Contents

- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
- [Contributing](#contributing)
- [License](#license)
- [Contact](#contact)
- [Releases](#releases)
- [Additional Resources](#additional-resources)
- [Acknowledgments](#acknowledgments)
## Features
- Long-term Memory: Store and retrieve context for LLMs, enhancing their ability to provide relevant responses.
- Model Context Protocol: Adhere to the MCP standards for seamless integration with various LLM architectures.
- User-Friendly API: Easy-to-use API for developers to integrate memory functionalities into their applications.
- Scalability: Designed to handle multiple requests and scale with your needs.
## Installation
To get started with the Memory MCP Server, follow these steps:
1. **Clone the repository:**

   ```bash
   git clone https://github.com/Sinhan88/memory-mcp-server.git
   ```

2. **Navigate to the project directory:**

   ```bash
   cd memory-mcp-server
   ```

3. **Install dependencies:** Make sure you have Python and pip installed, then run:

   ```bash
   pip install -r requirements.txt
   ```

4. **Run the server:**

   ```bash
   python app.py
   ```
## Usage
Once the server is running, you can interact with it using the API endpoints. Here’s a quick guide on how to use the Memory MCP Server.
### API Endpoints
- **Store Memory:** Send a POST request to `/store` with the following JSON body:

  ```json
  { "context": "Your context here", "model_id": "unique_model_identifier" }
  ```

- **Retrieve Memory:** Send a GET request to `/retrieve?model_id=unique_model_identifier`.
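The semantics of the two endpoints above can be sketched as a minimal in-memory mapping from `model_id` to stored contexts. This is an illustration of the API contract only, not the server's actual implementation; the `MemoryStore` class and its method names are hypothetical.

```python
# Minimal in-memory sketch of the /store and /retrieve semantics.
# Illustrative only: the real server's storage backend may differ.

class MemoryStore:
    def __init__(self):
        # model_id -> list of stored context strings
        self._memories = {}

    def store(self, context: str, model_id: str) -> dict:
        """Mirror of POST /store: append a context under model_id."""
        self._memories.setdefault(model_id, []).append(context)
        return {"status": "ok", "model_id": model_id}

    def retrieve(self, model_id: str) -> dict:
        """Mirror of GET /retrieve?model_id=...: return stored contexts."""
        return {"model_id": model_id,
                "contexts": self._memories.get(model_id, [])}


store = MemoryStore()
store.store("I love programming.", "model_1")
print(store.retrieve("model_1"))
# → {'model_id': 'model_1', 'contexts': ['I love programming.']}
```

Appending rather than overwriting per `model_id` reflects the "long-term memory" goal: each call to `/store` adds context instead of replacing it.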
### Example Requests
Using `curl`, you can test the API as follows:

- **Store Memory:**

  ```bash
  curl -X POST http://localhost:5000/store \
    -H "Content-Type: application/json" \
    -d '{"context": "I love programming.", "model_id": "model_1"}'
  ```

- **Retrieve Memory:**

  ```bash
  curl "http://localhost:5000/retrieve?model_id=model_1"
  ```
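The same requests can be issued from Python using only the standard library. The sketch below builds the two requests against the default `http://localhost:5000` address used in this README; the helper function names are hypothetical, and the actual send is left commented out since it requires the server to be running.

```python
import json
import urllib.request

BASE_URL = "http://localhost:5000"  # default address from the curl examples

def build_store_request(context: str, model_id: str) -> urllib.request.Request:
    """Build (but do not send) the POST /store request."""
    payload = json.dumps({"context": context, "model_id": model_id}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/store",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def build_retrieve_request(model_id: str) -> urllib.request.Request:
    """Build (but do not send) the GET /retrieve request."""
    return urllib.request.Request(f"{BASE_URL}/retrieve?model_id={model_id}")


req = build_store_request("I love programming.", "model_1")
print(req.full_url, req.get_method())
# → http://localhost:5000/store POST

# With the server running, send it like this:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```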
## Contributing
We welcome contributions to the Memory MCP Server! If you want to help, please follow these steps:
1. **Fork the repository:** Click the "Fork" button at the top right of this page.

2. **Create a branch:**

   ```bash
   git checkout -b feature/YourFeature
   ```

3. **Make your changes:** Edit the code, add features, or fix bugs.

4. **Commit your changes:**

   ```bash
   git commit -m "Add your message here"
   ```

5. **Push to the branch:**

   ```bash
   git push origin feature/YourFeature
   ```

6. **Open a pull request:** Go to the original repository and click "New Pull Request".
## License
This project is licensed under the MIT License. See the LICENSE file for details.
## Contact
For any questions or feedback, feel free to reach out:
- Author: Sinhan88
- Email: [email protected]
- GitHub: Sinhan88
## Releases
You can find the latest releases of the Memory MCP Server on the repository's GitHub Releases page. Download and run the release appropriate for your system.
## Additional Resources
- **Documentation:** More detailed documentation is available in the `docs` folder.
- **Community:** Join our discussion forum to share ideas and get help.
## Acknowledgments
Thanks to the contributors and the community for their support. Special thanks to the developers of the libraries that made this project possible.
## Topics
This repository covers various topics including:
- claude
- cursor
- cursor-ai
- cursorai
- llm
- llm-memory
- llms
- mcp
- mcp-server
- model-context-protocol
Feel free to explore these topics further!
Thank you for visiting the Memory MCP Server repository! We hope you find it useful for your projects involving LLMs and long-term memory.