MemGPT
About
Enhance your LLMs with a versatile memory system that supports seamless model transitions and conversation tracking across various providers such as OpenAI, Anthropic, OpenRouter, and Ollama.
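As a rough illustration of how an MCP client might talk to a memory server like this one, the sketch below uses the official Python MCP SDK to connect over stdio, list the exposed tools, and call a memory tool. The launch command, the tool name `save_memory`, and its arguments are placeholders, not the server's actual interface.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command; substitute the memory server's real entry point.
server_params = StdioServerParameters(command="python", args=["-m", "memgpt_mcp_server"])

async def main() -> None:
    # Open a stdio transport to the server and start an MCP session.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the memory server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical tool name and arguments; adjust to the listed tools.
            result = await session.call_tool(
                "save_memory",
                arguments={"content": "User prefers concise answers."},
            )
            print(result.content)

asyncio.run(main())
```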
Explore Similar MCP Servers
Supermemory
Empower yourself with a versatile personal information management tool that securely collects, organizes, and retrieves data from multiple sources. Benefit from strong encryption and the option to self-host.
Mem0 (Long-Term Memory)
Experience advanced long-term memory with semantic indexing, retrieval, and search across multiple LLM providers, backed by PostgreSQL vector storage for durable, well-organized, and easily accessible data.
Just Prompt (Multi-LLM Provider)
Explore a unified interface that streamlines communication with LLM providers such as OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. Send prompts efficiently and save responses to files for later use.
Mem0
Easily connect to mem0.ai for efficient storage, retrieval, and semantic search of coding preferences, code snippets, and programming knowledge via a reliable FastMCP server exposed over SSE connections.
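To give a sense of the server side described here, below is a minimal FastMCP sketch that exposes a couple of memory tools over SSE. The in-memory list and the tool names are illustrative stand-ins, not Mem0's actual API.

```python
from mcp.server.fastmcp import FastMCP

# Illustrative in-memory store; the real server would call mem0.ai instead.
_MEMORIES: list[str] = []

mcp = FastMCP("coding-memory")

@mcp.tool()
def add_memory(text: str) -> str:
    """Store a coding preference or snippet."""
    _MEMORIES.append(text)
    return f"Stored memory #{len(_MEMORIES)}"

@mcp.tool()
def search_memories(query: str) -> list[str]:
    """Naive substring search; a real server would use semantic search."""
    return [m for m in _MEMORIES if query.lower() in m.lower()]

if __name__ == "__main__":
    # Serve over SSE so MCP clients can connect via HTTP.
    mcp.run(transport="sse")
```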
Consult LLM
Offload complex reasoning tasks to advanced language models such as OpenAI o3, Google Gemini 2.5 Pro, and DeepSeek Reasoner. Send markdown prompts along with code context and git diffs to receive detailed responses, including cost breakdowns for each request.
OpenRouter
Experience seamless integration of various AI models through the OpenRouter platform, enhancing adaptability in selecting and utilizing models for a wide range of applications such as chatbots and content creation.
Ollama
Bridges Ollama's local large language models to the Model Context Protocol (MCP) for fast, secure access. Keep data private and under your control while running local instances for optimal performance.
LLM Gateway
Streamline the coordination of various LLM providers through a single gateway. Automatically choose models, employ semantic caching, and optimize costs for dependable production implementations.
Memory Plus
Manage memories efficiently with a compact, local RAG memory store designed for MCP agents. Seamlessly store, retrieve, update, delete, and visualize persistent data across sessions.
OpenAI MCP
Enhance connectivity between OpenAI and Anthropic models through prompt customization, response streaming, and effective caching, ideal for applications that demand tailored Large Language Model (LLM) usage.
Letta
Uses the Letta API to integrate and manage agents, memory blocks, and tools, supporting sophisticated AI-driven interactions and memory management.
Ollama
Empower your on-premises AI capabilities with seamless integration of Ollama's local LLMs into MCP-compatible software. Unlock customized model deployment and in-house data management for greater control over AI processing.
Memento
Unlock advanced persistent memory with a SQLite-powered knowledge graph that stores entities, observations, and relations. Benefit from full-text and semantic search backed by BGE-M3 embeddings, making it easy to retrieve relevant context from past conversations.
OpenRouter
Unlock a wide array of AI models from OpenRouter for seamless integration, allowing for rich interactions that combine vision and language capabilities. Benefit from smart model selection, efficient caching, and reliable error management for a smoother user experience.
Mindbridge
Connects various leading LLM providers such as OpenAI, Anthropic, Google, DeepSeek, OpenRouter, and Ollama via a consolidated interface. This allows for analyzing responses, comparing results, and harnessing unique reasoning strengths from diverse models.