
🧠 Memory MCP Server

An MCP (Model Context Protocol) server providing long-term memory for LLMs



Overview

Welcome to the Memory MCP Server! This project implements the Model Context Protocol (MCP) to provide long-term memory for Large Language Models (LLMs). With the growing need for more context-aware AI applications, this server acts as a bridge, allowing LLMs to retain information over extended interactions.

Table of Contents

  • Features
  • Installation
  • Usage
  • Contributing
  • License
  • Contact
  • Releases
  • Additional Resources
  • Acknowledgments
  • Topics

Features

  • Long-term Memory: Store and retrieve context for LLMs, enhancing their ability to provide relevant responses.
  • Model Context Protocol: Adheres to MCP standards for seamless integration with various LLM architectures.
  • User-Friendly API: Easy-to-use API for developers to integrate memory functionalities into their applications.
  • Scalability: Designed to handle multiple requests and scale with your needs.

Installation

To get started with the Memory MCP Server, follow these steps:

  1. Clone the Repository:

    git clone https://github.com/Sinhan88/memory-mcp-server.git
    
  2. Navigate to the Project Directory:

    cd memory-mcp-server
    
  3. Install Dependencies:

    Make sure you have Python and pip installed. Then run:

    pip install -r requirements.txt
    
  4. Run the Server:

    Execute the following command to start the server (a quick reachability check is sketched after these steps):

    python app.py
    
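Once python app.py is running, you can confirm the server is reachable before moving on. The snippet below is a minimal reachability check, not part of the project itself; it assumes the server listens on http://localhost:5000 (the address used in the Usage examples below) and calls only the documented /retrieve endpoint. Since the response body is not specified here, only the HTTP status is printed.

    # Minimal reachability check (assumes http://localhost:5000 and the
    # documented /retrieve endpoint; the response format is not assumed).
    import requests

    try:
        resp = requests.get(
            "http://localhost:5000/retrieve",
            params={"model_id": "smoke_test"},
            timeout=5,
        )
        print(f"Server responded with HTTP {resp.status_code}")
    except requests.ConnectionError:
        print("Could not reach the server at http://localhost:5000")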

Usage

Once the server is running, you can interact with it using the API endpoints. Here’s a quick guide on how to use the Memory MCP Server.

API Endpoints

  • Store Memory: Send a POST request to /store with the following JSON body:

    {
        "context": "Your context here",
        "model_id": "unique_model_identifier"
    }
    
  • Retrieve Memory: Send a GET request to /retrieve?model_id=unique_model_identifier.

Example Requests

Using curl, you can test the API as follows:

  1. Store Memory:

    curl -X POST http://localhost:5000/store -H "Content-Type: application/json" -d '{"context": "I love programming.", "model_id": "model_1"}'
    
  2. Retrieve Memory:

    curl "http://localhost:5000/retrieve?model_id=model_1"
    
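The same calls can be made programmatically. Below is a minimal Python client sketch using the requests library; it assumes the server address http://localhost:5000 from the examples above and that both endpoints return JSON bodies (the exact response schema is not documented here).

    # Minimal Python client sketch for the two documented endpoints.
    # Assumptions: the server runs on http://localhost:5000 (as in the curl
    # examples above) and both endpoints return JSON; the response schema
    # is not specified, so results are returned as plain dictionaries.
    import requests

    BASE_URL = "http://localhost:5000"

    def store_memory(context: str, model_id: str) -> dict:
        """Store a piece of context under the given model identifier."""
        resp = requests.post(
            f"{BASE_URL}/store",
            json={"context": context, "model_id": model_id},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    def retrieve_memory(model_id: str) -> dict:
        """Retrieve previously stored context for the given model identifier."""
        resp = requests.get(
            f"{BASE_URL}/retrieve",
            params={"model_id": model_id},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        print(store_memory("I love programming.", "model_1"))
        print(retrieve_memory("model_1"))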

Contributing

We welcome contributions to the Memory MCP Server! If you want to help, please follow these steps:

  1. Fork the Repository: Click the "Fork" button at the top right of this page.

  2. Create a Branch:

    git checkout -b feature/YourFeature
    
  3. Make Your Changes: Edit the code, add features, or fix bugs.

  4. Commit Your Changes:

    git commit -m "Add your message here"
    
  5. Push to the Branch:

    git push origin feature/YourFeature
    
  6. Open a Pull Request: Go to the original repository and click "New Pull Request".

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contact

For any questions or feedback, feel free to reach out through the project's GitHub repository.

Releases

You can find the latest releases of the Memory MCP Server on the project's GitHub Releases page. Please download and run the appropriate file for your system.

Additional Resources

  • Documentation: More detailed documentation is available in the docs folder.
  • Community: Join our discussion forum to share ideas and get help.

Acknowledgments

Thanks to the contributors and the community for their support. Special thanks to the developers of the libraries that made this project possible.


Topics

This repository covers various topics including:

  • claude
  • cursor
  • cursor-ai
  • cursorai
  • llm
  • llm-memory
  • llms
  • mcp
  • mcp-server
  • model-context-protocol

Feel free to explore these topics further!


Thank you for visiting the Memory MCP Server repository! We hope you find it useful for your projects involving LLMs and long-term memory.
