Memento
What is Memento?
Memento is a scalable knowledge graph memory system for large language models (LLMs). It provides persistent ontological memory with advanced relation management and semantic search capabilities.
How to use Memento?
To use Memento, set up a Neo4j database and configure it with the required environment variables. You can then interact with the system through the Model Context Protocol (MCP) to manage entities and relations.
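A minimal setup sketch of those two steps, assuming Docker is available and using the official `neo4j` image; the password, ports, and image tag are illustrative and should match the values in the server config shown later on this page:

```shell
# Start a local Neo4j instance (illustrative password; change it for real use)
docker run -d --name memento-neo4j \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/memento_password \
  neo4j:5

# Point Memento at the database via environment variables
export MEMORY_STORAGE_TYPE="neo4j"
export NEO4J_URI="bolt://127.0.0.1:7687"
export NEO4J_USERNAME="neo4j"
export NEO4J_PASSWORD="memento_password"

# Launch the server; an MCP-capable client connects to it over stdio
npx -y @gannonh/memento-mcp
```

In practice, MCP clients usually launch the server themselves from a JSON config (as in the Server Config section) rather than from a shell, but the environment variables are the same either way.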
Key features of Memento?
- Scalable knowledge graph with semantic search capabilities.
- Temporal awareness for tracking changes over time.
- Advanced relation management with confidence decay.
- Support for multiple embedding models for semantic search.
Use cases of Memento?
- Enhancing LLMs with persistent memory for improved context retention.
- Semantic search for retrieving related entities based on meaning.
- Temporal analysis of knowledge evolution over time.
FAQ from Memento?
- What is the storage backend for Memento?
Memento uses Neo4j as its storage backend for both graph and vector search capabilities.
- How do I set up Memento?
You can set up Memento by installing Neo4j and configuring the necessary environment variables as outlined in the documentation.
- Can Memento be integrated with other LLMs?
Yes, Memento is designed to work with any LLM client that supports the Model Context Protocol (MCP).
Server Config
{
  "mcpServers": {
    "memento": {
      "command": "npx",
      "args": [
        "-y",
        "@gannonh/memento-mcp"
      ],
      "env": {
        "MEMORY_STORAGE_TYPE": "neo4j",
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USERNAME": "neo4j",
        "NEO4J_PASSWORD": "memento_password",
        "NEO4J_DATABASE": "neo4j",
        "NEO4J_VECTOR_INDEX": "entity_embeddings",
        "NEO4J_VECTOR_DIMENSIONS": "1536",
        "NEO4J_SIMILARITY_FUNCTION": "cosine",
        "OPENAI_API_KEY": "your-openai-api-key",
        "OPENAI_EMBEDDING_MODEL": "text-embedding-3-small"
      }
    }
  }
}