
Show HN: Engram – Open-source agent memory that beats Mem0 by 20% on LOCOMO

2 points | tstockham | 6 days ago | engram.fyi

I built Engram because every AI agent I worked with forgot everything between sessions. Existing solutions (Mem0, Zep) are Python-first and extraction-based. They aggressively compress conversations into facts at write time.

Engram takes the opposite approach: store memories with rich metadata and invest intelligence at read time, when you actually know the query. TypeScript, SQLite, zero infrastructure.
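To make the contrast concrete, here is a minimal self-contained sketch of the read-time idea (not Engram's actual API; `remember`, `recall`, and the keyword scoring are illustrative stand-ins): the write path stores raw text plus metadata with no extraction, and all the matching work happens only once a query arrives.

```typescript
// Hypothetical sketch: cheap writes, query-aware reads.
type Memory = { text: string; tags: string[]; ts: number };

const store: Memory[] = [];

function remember(text: string, tags: string[]): void {
  // Write path: no compression into "facts", just raw text + metadata.
  store.push({ text, tags, ts: Date.now() });
}

function recall(query: string, limit = 3): Memory[] {
  // Read path: score every memory against the query we now actually have.
  const terms = query.toLowerCase().split(/\s+/);
  return store
    .map((m) => ({
      m,
      score: terms.filter((t) => m.text.toLowerCase().includes(t)).length,
    }))
    .filter((s) => s.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((s) => s.m);
}
```

A real implementation would replace the keyword overlap with embeddings or an LLM reranker, but the shape is the same: nothing is thrown away at write time, so read-time retrieval can use context the writer never had.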

Ran the LOCOMO benchmark (same one Mem0 used to claim SOTA):

Engram: 80.0% (10 conversations, 1,540 questions)
Mem0 published: 66.9%
93.6% fewer tokens than full-context approaches

Works as an MCP server, REST API, or embedded SDK. Supports Gemini, OpenAI, Ollama, Groq, and any OpenAI-compatible provider.

npm install -g engram-sdk && engram init

https://engram.fyi | https://github.com/tstockham96/engram
