ShackleAI Memory

by ShackleAI · v0.5.0 · MIT license

1,416 weekly downloads

Persistent memory for AI coding tools. Give any MCP-compatible AI tool memory that persists across sessions — decisions, conventions, bugs, architecture, and context stored as human-readable Markdown with semantic search.

Tags: claude-code-hooks, context, local-first, markdown, mcp-registry, memory, offline, persistence, semantic-search, sqlite, universal-setup

Features

  • One-command setup — npx setup configures your AI tool automatically
  • 11 MCP tools — store, search, update, delete, TODO tracking, export/import, cleanup
  • Universal client support — auto-generates rules for Claude Code, Cursor, Windsurf, VS Code Copilot, Cline
  • Claude Code hooks — automatic reminders to store memories and save session summaries
  • Semantic search — find memories by meaning, not just keywords
  • Local-first — everything at ~/.shackleai/, no cloud dependency
  • Zero config — no API keys, no accounts, no setup beyond one command
  • Offline — local embeddings via MiniLM-L6-v2 (free, runs on CPU)
  • Human-readable — Markdown files you can read, edit, and version control
  • Smart deduplication — automatically merges duplicate memories (0.85 threshold)
  • TODO tracking — track pending/in-progress/done across sessions
  • Multi-project — separate memory spaces per project, auto-detected
  • Auto-update — @latest tag always fetches newest version

About

ShackleAI Memory is the first MCP-native memory server. It gives AI coding tools like Claude Code, Cursor, Windsurf, VS Code Copilot, and OpenAI Codex persistent memory across sessions.

Your AI remembers decisions, conventions, bugs, and context, and picks up exactly where you left off. No cloud account, no API keys, no configuration beyond one install command.

How It Works

  1. Run the setup command (one-time)
  2. Start your AI tool in the project directory
  3. Memory server starts automatically in the background
  4. AI stores decisions, conventions, and bugs as you work
  5. Next session — AI searches memory and picks up where you left off

You don't need to do anything after setup. The AI sees the memory tools and uses them proactively — storing important decisions, searching for past context, and saving session summaries.

Storage

All data lives locally on your machine at ~/.shackleai/. Memories are stored both in SQLite (for semantic search) and as Markdown files (human-readable, git-friendly). The embedding model downloads once (~80MB) on first run; after that, everything works fully offline.
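
Because the store is plain files, you can inspect it directly. A minimal sketch, assuming a POSIX shell; only the ~/.shackleai/ path comes from the docs, and the layout inside it is an assumption:

```shell
# Peek at the local memory store. Only ~/.shackleai/ is documented;
# what it contains (SQLite index, per-project Markdown) may vary by version.
MEM_DIR="$HOME/.shackleai"
if [ -d "$MEM_DIR" ]; then
  ls -R "$MEM_DIR"
else
  echo "no memory store yet (run the setup command first)"
fi
```

Since the Markdown files are ordinary text, they also diff and version-control cleanly.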

Why ShackleAI Memory?

  • No vendor lock-in — MIT licensed, all data on your machine
  • Works with any MCP client — not tied to one AI tool
  • Team-friendly — commit .mcp.json and CLAUDE.md to git so your whole team gets memory
  • LLM-portable — switch AI tools anytime, your memory stays
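
The team-friendly bullet amounts to committing a project-level .mcp.json with the same mcpServers entry used for manual installation; a sketch:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@shackleai/memory-mcp@latest"]
    }
  }
}
```

Claude Code reads a project-root .mcp.json for project-scoped servers; other MCP clients may need the same entry placed in their own config file instead.
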

Manual Installation

Quick Setup

Run this in your project directory:

npx -y @shackleai/memory-mcp@latest setup

Claude Code

claude mcp add shackleai-memory -- npx -y @shackleai/memory-mcp@latest

Cursor / Windsurf / VS Code Copilot / Claude Desktop

Add to your MCP config file:

{
  "mcpServers": {
    "memory": {
      "args": [
        "-y",
        "@shackleai/memory-mcp@latest"
      ],
      "command": "npx"
    }
  }
}

Requirements: Node.js 20+ (for npx). The first run downloads the embedding model (~80MB, one-time); it works fully offline after that.
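
A quick preflight for the Node.js requirement, assuming a POSIX shell with node on PATH:

```shell
# Check that the installed Node.js major version meets the 20+ requirement.
major=$(node -p 'process.versions.node.split(".")[0]')
if [ "$major" -ge 20 ]; then
  echo "Node $major OK - npx will work"
else
  echo "Node $major is too old - install Node.js 20 or newer"
fi
```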