MCP Server Setup — All Editors & Clients

Copy-paste the config for your editor and you're done. No nmem init needed — the server auto-initializes on first use.



Requirements

  • Python 3.11+
  • pip or uv package manager

# Install via pip
pip install neural-memory

# Or via uv (faster)
uv pip install neural-memory

Note: If using uvx (recommended for Claude Code), you don't need to install manually — uvx handles it automatically.


Claude Code Plugin (Recommended)

The easiest way. One command installs everything:

/plugin marketplace add nhadaututtheky/neural-memory
/plugin install neural-memory@neural-memory-marketplace

This auto-configures the MCP server, skills, commands, agent, and hooks.

Done. No further setup needed.


Claude Code CLI

The official way to add MCP servers to Claude Code:

# Global (all projects):
claude mcp add --scope user neural-memory -- nmem-mcp

# Or with uvx (no pip install needed):
claude mcp add --scope user neural-memory -- uvx --from neural-memory nmem-mcp

# Project-only:
claude mcp add neural-memory -- nmem-mcp

Alternatively, add to your project's .mcp.json manually:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Or if you installed via pip (no uvx):

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

Note: Do NOT add MCP servers to ~/.claude/settings.json or ~/.claude/mcp_servers.json — Claude Code does not read MCP config from those files. Use claude mcp add or .mcp.json.


Cursor

Add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project):

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx (no pip install needed):

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Restart Cursor after adding the config.


Windsurf (Codeium)

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Restart Windsurf after adding.


VS Code

With Continue Extension

Add to ~/.continue/config.json under mcpServers:

{
  "mcpServers": [
    {
      "name": "neural-memory",
      "command": "nmem-mcp"
    }
  ]
}

With Copilot Chat (MCP support)

Add to VS Code settings.json:

{
  "mcp": {
    "servers": {
      "neural-memory": {
        "command": "nmem-mcp"
      }
    }
  }
}

VS Code Extension (GUI)

For a graphical experience, install the NeuralMemory VS Code Extension from the marketplace.


Claude Desktop

Add to claude_desktop_config.json:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Windows — full path (if nmem-mcp not in PATH):

{
  "mcpServers": {
    "neural-memory": {
      "command": "python",
      "args": ["-m", "neural_memory.mcp"]
    }
  }
}

Restart Claude Desktop after adding.


Cline

Add to Cline MCP settings (cline_mcp_settings.json in your VS Code workspace):

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp",
      "disabled": false
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"],
      "disabled": false
    }
  }
}

Zed

Add to Zed settings.json (~/.config/zed/settings.json):

{
  "language_models": {
    "mcp_servers": {
      "neural-memory": {
        "command": "nmem-mcp"
      }
    }
  }
}

Google Antigravity

Google's AI-powered editor with built-in MCP Store.

Option 1: MCP Store (GUI)

  1. Open the MCP Store via the ... dropdown at the top of the editor's agent panel
  2. Browse & install servers directly
  3. Authenticate if prompted

Option 2: Custom Config (for NeuralMemory)

  1. Open MCP Store → click "Manage MCP Servers"
  2. Click "View raw config"
  3. Add NeuralMemory to mcp_config.json:

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}
  4. Save and restart the editor.

Tip: Antigravity also supports connecting to NeuralMemory's FastAPI server mode. Run nmem serve and connect via HTTP if you prefer server-side integration.


JetBrains IDEs (IntelliJ, PyCharm, WebStorm)

JetBrains IDEs support MCP via the built-in AI Assistant or the JetBrains AI plugin.

Go to Settings → Tools → AI Assistant → MCP Servers → Add, or edit the config file directly:

  • Location: .idea/mcpServers.json (project) or global settings

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Restart the IDE after adding.


Gemini CLI

Add to ~/.gemini/settings.json:

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Amazon Q Developer

Add to ~/.aws/amazonq/mcp.json:

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

With uvx:

{
  "mcpServers": {
    "neural-memory": {
      "command": "uvx",
      "args": ["--from", "neural-memory", "nmem-mcp"]
    }
  }
}

Neovim

With mcp-hub.nvim or similar MCP plugin, add to your mcpservers.json:

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

Or configure in Lua:

require("mcphub").setup({
  servers = {
    ["neural-memory"] = {
      command = "nmem-mcp",
    },
  },
})

Warp Terminal

Add to Warp's MCP config (~/.warp/mcp.json):

{
  "mcpServers": {
    "neural-memory": {
      "command": "nmem-mcp"
    }
  }
}

Custom / Other MCP Clients

NeuralMemory uses stdio transport (JSON-RPC 2.0 over stdin/stdout). Any MCP-compatible client can connect:

{
  "name": "neural-memory",
  "transport": "stdio",
  "command": "nmem-mcp"
}

Or with explicit Python:

{
  "name": "neural-memory",
  "transport": "stdio",
  "command": "python",
  "args": ["-m", "neural_memory.mcp"]
}

Alternative: Python Module Directly

If nmem-mcp is not in your PATH, use the Python module:

{
  "neural-memory": {
    "command": "python",
    "args": ["-m", "neural_memory.mcp"]
  }
}

macOS/Linux with specific Python:

{
  "neural-memory": {
    "command": "python3",
    "args": ["-m", "neural_memory.mcp"]
  }
}

Windows with full path:

{
  "neural-memory": {
    "command": "C:\\Users\\YOU\\AppData\\Local\\Programs\\Python\\Python312\\python.exe",
    "args": ["-m", "neural_memory.mcp"]
  }
}

Alternative: Docker

docker run -i --rm -v neuralmemory:/root/.neuralmemory ghcr.io/nhadaututtheky/neural-memory:latest nmem-mcp

Or as an MCP client config:

{
  "neural-memory": {
    "command": "docker",
    "args": [
      "run", "-i", "--rm",
      "-v", "neuralmemory:/root/.neuralmemory",
      "ghcr.io/nhadaututtheky/neural-memory:latest",
      "nmem-mcp"
    ]
  }
}

Environment Variables

| Variable | Default | Description |
|---|---|---|
| NEURALMEMORY_BRAIN | "default" | Brain name to use |
| NEURALMEMORY_DATA_DIR | ~/.neuralmemory | Data directory |
| NEURAL_MEMORY_DEBUG | 0 | Enable debug logging (1 to enable) |
| MEM0_API_KEY | (none) | Mem0 API key (for import) |
| COGNEE_API_KEY | (none) | Cognee API key (for import) |

Example with custom brain:

{
  "neural-memory": {
    "command": "nmem-mcp",
    "env": {
      "NEURALMEMORY_BRAIN": "work"
    }
  }
}
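Under the hood this is a plain lookup-with-default pattern: the env block in your MCP config populates the server process's environment, and each variable falls back to the default from the table above. The sketch below illustrates that resolution (an illustration only, not the actual server code):

```python
import os

def resolve_settings(env: dict) -> dict:
    """Illustrative resolution of NeuralMemory's environment variables,
    mirroring the defaults listed in the table above."""
    return {
        "brain": env.get("NEURALMEMORY_BRAIN", "default"),
        "data_dir": env.get("NEURALMEMORY_DATA_DIR",
                            os.path.expanduser("~/.neuralmemory")),
        "debug": env.get("NEURAL_MEMORY_DEBUG", "0") == "1",
    }

print(resolve_settings({}))  # all defaults
print(resolve_settings({"NEURALMEMORY_BRAIN": "work",
                        "NEURAL_MEMORY_DEBUG": "1"}))
```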

Resource Usage

| Metric | Value |
|---|---|
| RAM (idle) | ~12-15 MB |
| RAM (active, small brain) | ~30-35 MB |
| RAM (active, large brain) | ~55-60 MB |
| CPU | Near 0% when idle |
| Disk | ~1-50 MB per brain (SQLite) |
| Startup time | < 2 seconds |

NeuralMemory is lightweight — it won't slow down your editor.


Available Tools

Once configured, these 38 tools are available to your AI assistant:

Core Memory

| Tool | Description |
|---|---|
| nmem_remember | Store a memory (fact, decision, insight, todo, error, etc.) |
| nmem_recall | Query with spreading activation (depth 0-3) |
| nmem_context | Inject recent context at session start |
| nmem_todo | Quick TODO with 30-day expiry |
| nmem_auto | Auto-capture memories from conversation text |
| nmem_suggest | Autocomplete suggestions from brain neurons |
| nmem_edit | Edit memory type, content, or priority by fiber ID |
| nmem_forget | Soft delete (set expiry) or hard delete (permanent removal) |

Brain Management

| Tool | Description |
|---|---|
| nmem_stats | Brain statistics and freshness |
| nmem_health | Brain health diagnostics (purity score, grade, top penalties) |
| nmem_evolution | Brain evolution metrics (maturation, plasticity) |
| nmem_version | Brain version control (snapshot, rollback, diff) |
| nmem_transplant | Copy memories between brains |
| nmem_conflicts | View and resolve memory conflicts |
| nmem_alerts | Brain health alerts lifecycle |

Session & Context

| Tool | Description |
|---|---|
| nmem_session | Track working session state and progress |
| nmem_eternal | Save project context, decisions, instructions |
| nmem_recap | Load saved context at session start |

Learning & Training

| Tool | Description |
|---|---|
| nmem_index | Index codebase for code-aware recall |
| nmem_train | Train brain from documentation files |
| nmem_train_db | Train brain from database schema |
| nmem_habits | Manage learned workflow habits |
| nmem_review | Spaced repetition review system |
| nmem_pin | Pin/unpin memories (pinned = permanent, skip decay/prune) |

Cognitive Reasoning

| Tool | Description |
|---|---|
| nmem_hypothesize | Create and manage hypotheses with Bayesian confidence tracking |
| nmem_evidence | Submit evidence for/against — auto-updates confidence |
| nmem_predict | Falsifiable predictions with deadlines, linked to hypotheses |
| nmem_verify | Verify predictions correct/wrong — propagates to hypotheses |
| nmem_cognitive | Hot index: ranked summary of active hypotheses + predictions |
| nmem_gaps | Knowledge gap metacognition: detect, track, resolve |
| nmem_schema | Schema evolution: evolve hypotheses via SUPERSEDES chain |
| nmem_explain | Trace shortest path between two concepts |

Utilities

| Tool | Description |
|---|---|
| nmem_import | Import from ChromaDB, Mem0, Cognee, Graphiti, LlamaIndex |
| nmem_narrative | Generate timeline/topic/causal narratives |
| nmem_telegram_backup | Send brain .db backup to Telegram |

Sync (Multi-Device)

| Tool | Description |
|---|---|
| nmem_sync | Trigger manual sync (push/pull/full) |
| nmem_sync_status | Show pending changes, devices, last sync |
| nmem_sync_config | Configure hub URL, auto-sync, conflict strategy |

Tool Tiers

By default all 38 tools are exposed on every API turn. If you want to reduce token overhead, configure a tool tier in ~/.neuralmemory/config.toml:

[tool_tier]
tier = "standard"   # minimal | standard | full

Or via CLI:

nmem config tier --show       # show current tier
nmem config tier standard     # set to standard
nmem config tier full         # reset to full

| Tier | Tools | Est. Tokens | Savings |
|---|---|---|---|
| full (default) | 26 | ~3,800 | |
| standard | 8 | ~1,400 | ~63% |
| minimal | 4 | ~700 | ~82% |
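The Savings column is simply each tier's token estimate expressed as a reduction relative to the full tier:

```python
FULL_TOKENS = 3800  # estimated schema tokens for the full tier

def savings(tier_tokens: int) -> str:
    """Percent reduction in schema tokens versus the full tier."""
    return f"~{round((1 - tier_tokens / FULL_TOKENS) * 100)}%"

print(savings(1400))  # standard → ~63%
print(savings(700))   # minimal → ~82%
```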

Tier contents:

  • minimal — remember, recall, context, recap
  • standard — minimal + todo, session, auto, eternal
  • full — all 38 tools

Hidden tools remain callable — only the schema listing changes. If the AI model already knows a tool name, it can still call it even when the tool is not exposed in tools/list.


Resources

The MCP server provides resources for system prompts:

| Resource URI | Description |
|---|---|
| neuralmemory://prompt/system | Full system prompt for AI assistants |
| neuralmemory://prompt/compact | Compact version for token-limited contexts |

Get MCP Config via CLI

nmem mcp-config

View System Prompt via CLI

nmem prompt            # Full prompt
nmem prompt --compact  # Compact version
nmem prompt --json     # As JSON

Agent Instructions

Copy these instructions into your project's CLAUDE.md (for Claude Code) or .cursorrules (for Cursor) to teach your AI assistant how to use NeuralMemory proactively.

For Claude Code

See docs/agent-instructions/CLAUDE.md for the full template.

For Cursor

See docs/agent-instructions/.cursorrules for the full template.

Quick Version (any editor)

## Memory System — NeuralMemory

This workspace uses NeuralMemory for persistent memory.
Use nmem_* MCP tools PROACTIVELY.

### Session Start (ALWAYS)
1. nmem_recap() — Resume context
2. nmem_context(limit=20, fresh_only=true) — Recent memories
3. nmem_session(action="get") — Current task

### Auto-Remember
- Decision made → nmem_remember(content="...", type="decision", priority=7)
- Bug fixed → nmem_remember(content="...", type="error", priority=7)
- TODO found → nmem_todo(task="...", priority=6)

### Auto-Recall
Before asking user → nmem_recall(query="<topic>", depth=1)

### Session End
nmem_auto(action="process", text="<session summary>")
nmem_session(action="set", feature="...", progress=0.8)

Troubleshooting

"nmem-mcp" not found

# Check if installed
pip show neural-memory

# Check if nmem-mcp is in PATH
which nmem-mcp    # macOS/Linux
where nmem-mcp    # Windows

# If not found, use Python module instead
python -m neural_memory.mcp

Tools not appearing in editor

  1. Verify the MCP config file path is correct for your editor
  2. Restart the editor completely
  3. Check editor logs for MCP connection errors
  4. Test manually: echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | nmem-mcp

Python version mismatch

# NeuralMemory requires Python 3.11+
python --version

# If you have multiple Python versions, specify the full path

Windows: encoding errors

NeuralMemory handles Windows stdio encoding automatically. If you still see encoding issues:

{
  "neural-memory": {
    "command": "python",
    "args": ["-m", "neural_memory.mcp"],
    "env": {
      "PYTHONIOENCODING": "utf-8"
    }
  }
}

Permission denied (macOS/Linux)

chmod +x $(which nmem-mcp)

uvx not found

# Install uv first
pip install uv

# Or use pipx
pipx install neural-memory

Debug mode

# Run with debug logging
NEURAL_MEMORY_DEBUG=1 nmem-mcp

Reset to fresh state

# macOS/Linux
rm -rf ~/.neuralmemory

# Windows
rmdir /s /q %USERPROFILE%\.neuralmemory

Quick Reference

| Editor | Config File | Config Format |
|---|---|---|
| Claude Code | claude mcp add or .mcp.json | { "mcpServers": { ... } } |
| Cursor | ~/.cursor/mcp.json | { "mcpServers": { ... } } |
| Windsurf | ~/.codeium/windsurf/mcp_config.json | { "mcpServers": { ... } } |
| Claude Desktop | See path above | { "mcpServers": { ... } } |
| VS Code (Continue) | ~/.continue/config.json | { "mcpServers": [ ... ] } |
| VS Code (Copilot) | VS Code settings.json | { "mcp": { "servers": { ... } } } |
| Cline | cline_mcp_settings.json | { "mcpServers": { ... } } |
| Zed | ~/.config/zed/settings.json | { "language_models": { "mcp_servers": { ... } } } |
| Antigravity | mcp_config.json (via MCP Store) | { "mcpServers": { ... } } |
| JetBrains | .idea/mcpServers.json | { "mcpServers": { ... } } |
| Gemini CLI | ~/.gemini/settings.json | { "mcpServers": { ... } } |
| Amazon Q | ~/.aws/amazonq/mcp.json | { "mcpServers": { ... } } |
| Neovim | mcpservers.json (plugin-dependent) | { "mcpServers": { ... } } |
| Warp | ~/.warp/mcp.json | { "mcpServers": { ... } } |

Minimum config for any editor:

{
  "neural-memory": {
    "command": "nmem-mcp"
  }
}

That's it. Copy, paste, restart. Done.