Stop rebuilding context in every agent. AgentMem gives you persistent, searchable memory that syncs across LangChain, CrewAI, OpenAI, Claude, and beyond.
// Initialize AgentMem
const agentmem = new AgentMem('your-api-key');

// Store a memory
await agentmem.store('User prefers dark mode');

// Search memories with semantic search
const results = await agentmem.search('user preferences');
// → [{ content: 'User prefers dark mode', score: 0.89 }]

// Sync across agents
await agentmem.sync(['agent-1', 'agent-2', 'agent-3']);
Works with your favorite frameworks
Find memories by meaning, not keywords. Powered by vector embeddings.
Share memories between agents. Team context that actually works.
<100ms response times. Built on Fastify with edge caching.
API key auth, tenant isolation, encrypted at rest.
Track memory usage, search patterns, and agent activity.
Deployed on Railway behind a Cloudflare proxy. Fast everywhere.
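Under the hood, semantic scores like the 0.89 in the quickstart come from comparing embedding vectors. A minimal sketch of that idea, using toy 3-dimensional vectors (AgentMem's actual embedding model and dimensionality are not specified here):

```typescript
// Cosine similarity: how aligned two embedding vectors are, in [-1, 1].
// Semantic search embeds both the stored memory and the query, then ranks
// memories by this score.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy "embeddings": similar texts point in similar directions.
const memoryVec = [0.9, 0.1, 0.2]; // "User prefers dark mode"
const queryVec  = [0.8, 0.2, 0.1]; // "user preferences"
console.log(cosineSimilarity(memoryVec, queryVec)); // high score → strong match
```

This is why search finds memories by meaning: two texts never need to share a keyword, only a direction in embedding space.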
Create an agent and get your API key in seconds.
POST /api/agents/register
Store any context your agent learns. Auto-embedded.
POST /api/memories
Retrieve relevant context with semantic search.
POST /api/memories/search
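Steps 2 and 3 can be driven over plain HTTP. A hedged sketch of building those requests, assuming Bearer-token auth and the request-body field names shown below (neither is confirmed by the endpoint list above):

```typescript
// Illustrative request builder for the AgentMem endpoints above.
// Assumptions: Authorization uses a Bearer token; the store body is
// { content } and the search body is { query }.
type ApiRequest = {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
};

const BASE = 'https://api.agentmem.dev';

function memoryRequest(path: string, apiKey: string, body: object): ApiRequest {
  return {
    url: BASE + path,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${apiKey}`, // assumption: Bearer auth
    },
    body: JSON.stringify(body),
  };
}

// Step 2: store any context your agent learns (embedded server-side).
const storeReq = memoryRequest('/api/memories', 'am_live_xxx', {
  content: 'User prefers dark mode',
});

// Step 3: retrieve relevant context with semantic search.
const searchReq = memoryRequest('/api/memories/search', 'am_live_xxx', {
  query: 'user preferences',
});

// Each request would then be sent with: await fetch(req.url, req)
```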
$ curl -X POST https://api.agentmem.dev/api/agents/register \
-H "Content-Type: application/json" \
-d '{"name": "my-agent", "framework": "langchain"}'
{"id": "abc123", "apiKey": "am_live_xxx..."}
That's it. Start storing memories immediately.
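Because the API is plain HTTP, wiring memory into any framework comes down to two hooks: search before the agent runs, store after. A hedged, framework-agnostic sketch (the hook shape and names here are illustrative, not part of AgentMem's SDK):

```typescript
// Hypothetical adapter: prepend relevant memories before each turn and
// persist what happened afterwards, so other agents can reuse it.
interface MemoryStore {
  store(content: string): Promise<void>;
  search(query: string): Promise<{ content: string; score: number }[]>;
}

async function withMemory(
  mem: MemoryStore,
  userInput: string,
  runAgent: (prompt: string) => Promise<string>,
): Promise<string> {
  // Before the turn: pull context relevant to the new input.
  const hits = await mem.search(userInput);
  const context = hits.map((h) => `- ${h.content}`).join('\n');
  const prompt = context
    ? `Known context:\n${context}\n\nUser: ${userInput}`
    : userInput;

  // Run the underlying framework's agent with the enriched prompt.
  const reply = await runAgent(prompt);

  // After the turn: persist what was learned.
  await mem.store(`User said: ${userInput}`);
  return reply;
}
```

The same wrapper works whether `runAgent` is a LangChain chain, a CrewAI task, or a raw model call — the memory layer stays outside the framework.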