Gives AI agents persistent memory backed by SQLite, with local LLM inference through Ollama integration. Provides chat with context retention and multi-client support across VS Code, Gemini-CLI, and terminal interfaces.
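To illustrate the persistent-memory idea, here is a minimal sketch of a SQLite-backed conversation store. The class and schema names are hypothetical (the server's actual schema and tool API are not documented here), and the Ollama call is omitted; the sketch only shows how messages might be persisted and replayed as context.

```python
import sqlite3

class ChatMemory:
    """Illustrative SQLite-backed memory; not the server's actual schema."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "session TEXT NOT NULL, "
            "role TEXT NOT NULL, "
            "content TEXT NOT NULL)"
        )

    def add(self, session, role, content):
        # Persist one chat turn for the given session.
        self.conn.execute(
            "INSERT INTO messages (session, role, content) VALUES (?, ?, ?)",
            (session, role, content),
        )
        self.conn.commit()

    def history(self, session, limit=20):
        # Return the most recent turns, oldest first, for prompt assembly.
        rows = self.conn.execute(
            "SELECT role, content FROM messages WHERE session = ? "
            "ORDER BY id DESC LIMIT ?",
            (session, limit),
        ).fetchall()
        return list(reversed(rows))

mem = ChatMemory()
mem.add("s1", "user", "Remember my name is Ada.")
mem.add("s1", "assistant", "Noted, Ada.")
print(mem.history("s1"))
```

A real server would prepend `history()` to each prompt before calling the local model, so context survives process restarts when a file path is used instead of `:memory:`.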
⚠ Local (STDIO) Server
This server runs as a process directly on your machine. It has access to your filesystem, environment variables, and SSH keys. Review the source code before installing.
Tools: 0
Indexed: Today
Transport: Local / STDIO
Security Scan: Pending — this server has not yet been analyzed.