Persistent memory layer for AI coding sessions. Context, decisions, and patterns survive across every conversation.
$ remb init
✔ Project "my-app" registered
✔ GitHub connected, scanning 132 files...
✔ 24 features extracted, 8 core memories saved
$ remb context
→ 8 core memories loaded
→ 3 recent conversations restored
→ Architecture: Next.js + tRPC + Prisma
$ AI now has full project context
The Problem
AI assistants lose all context between conversations. You repeat yourself, re-explain architecture, and watch your AI make the same mistakes you already corrected. Hours of shared context — gone with every new window.
Repeated Explanations
"This project uses App Router with server actions..." — typed for the 50th time.
Lost Decisions
You spent hours on a state management approach. New session? AI suggests the one you rejected.
No Project Awareness
AI doesn’t know your folder structure, naming conventions, or dependencies.
Broken Continuity
Yesterday you built auth together. Today: "What authentication approach would you like to use?"
Features
Memories in three layers — Core loads every session, Active surfaces on-demand, Archive stores long-term.
5-phase pipeline: Scout, Analyze, Architect, Review, Finalize. Extracts features, code symbols, architecture layers, and dependency graphs from your entire codebase.
Every session is logged — what was discussed, built, decided. Next session starts with full history.
Search memories across all projects. Say "do it like in project X" and your AI pulls matching patterns.
Queryable graph of every function, class, and component. Architecture layers auto-detected. Trace call chains, imports, and data flows.
Chat with your codebase. The AI assembles relevant memories, code symbols, and conversation history into every response — 15 tools at its disposal.
AI creates phased development plans. Track individual phases; plans auto-complete when every phase finishes. Visible in chat and on the dashboard.
Credentials stored with chmod 600. OAuth PKCE for login. Scoped tokens per project.
Visual project explorer, feature graph, memory manager, conversation history browser, MCP hub for connecting external AI tools.
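The credential-storage claim above (files locked down with chmod 600) is easy to verify locally. A minimal sketch — the path /tmp/credentials.json is illustrative, not Remb's actual storage location:

```shell
# Create a stand-in credentials file and restrict it to owner
# read/write only, the permission mode the docs describe.
touch /tmp/credentials.json
chmod 600 /tmp/credentials.json

# Print the octal mode — should output 600 (GNU stat; on macOS use stat -f '%Lp').
stat -c '%a' /tmp/credentials.json
```

Mode 600 means only the file's owner can read or write it; group and other users get nothing, which is the standard baseline for files holding tokens or keys.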
How It Works
Install the CLI, then run init — it auto-detects your IDE, offers to sign in, registers the project on Remb, and injects AI context into the right config files.
$ curl -fsSL https://www.useremb.com/install.sh | sh
$ remb init
ℹ Detected IDE: VS Code (GitHub Copilot)
Sign in now? [Y/n]: Y
✔ Authenticated as samie105!
✔ Project my-app initialized!
ℹ AI context injected into: .github/copilot-instructions.md
Scan your codebase with Remb. It analyzes every file — features, patterns, dependencies. Save decisions as persistent memories.
$ remb scan
✔ Scanning 247 files across 12 directories...
✔ Extracted 31 features, 5 service boundaries
$ remb save -f auth -c "Using PKCE OAuth with refresh rotation"
✔ Saved to auth (core tier)
When your AI starts a new conversation, Remb’s MCP server automatically injects project context, conversation history, and relevant memories.
$ # AI automatically calls on session start:
→ remb__memory_load_context()
Loading 8 core memories...
Loading 3 recent conversations...
Loading feature map (31 features)...
→ AI is now fully context-aware
Model Context Protocol
MCP is the open protocol that lets AI assistants connect to external tools. Remb exposes 42 tools as a first-class MCP server — meaning Claude, Cursor, Windsurf, VS Code Copilot, and any MCP client can access your project memory, code graph, plans, and scanning pipeline natively.
42 MCP Tools
Memory, conversations, plans, code graph, scanning, cross-project search, architecture analysis — all as autonomous tool calls.
Remote HTTP Server
Connect via useremb.com — no local binary needed. Add the URL to your MCP config and your AI has instant access.
Local stdio Mode
Run remb serve for a local MCP server. Works offline with any client that supports stdio transport.
Auto-Session Protocol
On session start, Remb automatically loads project context, conversation history, and architecture layers.
Remote (HTTP)
{
"mcpServers": {
"remb": {
"type": "http",
"url": "https://www.useremb.com/api/mcp"
}
}
}
Local stdio (offline)
{
"mcpServers": {
"remb": {
"command": "remb",
"args": ["serve"]
}
}
}
Command Line
Written in Go for instant startup. Ships as a single binary with zero runtime dependencies. Also available as a Node.js CLI via npm.
remb init
Initialize project — detects IDE, injects AI context, registers on Remb.
remb login
Authenticate via browser OAuth or API key.
remb scan
Trigger a cloud scan — checks git status, extracts features, updates context. Recommended over push.
remb push
Deprecated alias for remb scan — kept for backward compatibility.
remb save
Save a context entry for a specific feature or module.
remb get
Retrieve context entries with filtering by feature.
remb serve
Start the MCP server over stdio for AI tool integration.
remb context
Load full project context bundle — memories, features, tech stack.
remb memory
Create, list, search, promote, or delete persistent memories.
$ remb scan --path src/auth --depth 3
Scanning src/auth (depth: 3)...
→ Found 8 files, 3 feature boundaries
→ auth-provider.tsx → Authentication
→ session.ts → Session Management
→ middleware.ts → Auth Middleware
✔ 3 features updated, 12 context entries saved
$ remb get -f auth --format table
┌──────────────────┬─────────────────────────────┐
│ Feature │ Content │
├──────────────────┼─────────────────────────────┤
│ Authentication │ PKCE OAuth + refresh tokens │
│ Session Mgmt │ Server-side with httpOnly │
│ Auth Middleware │ Edge middleware, JWT verify │
└──────────────────┴─────────────────────────────┘
$ remb link --from auth --to session --type depends_on
✔ Linked auth → session (depends_on)
Get Started
Zero dependencies, Go binary
curl -fsSL https://www.useremb.com/install.sh | sh
Node.js CLI
npm install -g remb-cli
macOS & Linux
brew tap samie105/remb && brew install remb
Extension marketplace
ext install remb.remb
From zero to full AI context in 4 steps.
Install
Go binary is fastest — single binary, no runtime.
curl -fsSL https://www.useremb.com/install.sh | sh
Initialize
Auto-detects IDE, offers sign-in, registers project, injects AI context.
remb init
Scan
Trigger a cloud scan to extract features from your codebase.
remb scan
Connect your AI
Add Remb as an MCP server. Context injection is automatic.
{
"mcpServers": {
"remb": {
"type": "http",
"url": "https://www.useremb.com/api/mcp"
}
}
}
Set up once. Every AI conversation from here on starts with full project context.
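Before restarting your client, it's worth confirming the MCP config parses as valid JSON — a stray comma will silently break the connection. A quick sketch using Python's built-in JSON tool; the path /tmp/mcp.json stands in for wherever your client keeps its config:

```shell
# Write the Remb MCP server entry to a test file.
cat > /tmp/mcp.json <<'EOF'
{
  "mcpServers": {
    "remb": {
      "type": "http",
      "url": "https://www.useremb.com/api/mcp"
    }
  }
}
EOF

# json.tool exits non-zero on malformed JSON, so this prints
# "valid" only if the file parses cleanly.
python3 -m json.tool /tmp/mcp.json > /dev/null && echo "valid"
```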