Available on npm, VS Code & Homebrew

Your AI never
forgets again

Persistent memory layer for AI coding sessions. Context, decisions, and patterns survive across every conversation.

Terminal

$ remb init

Project "my-app" registered

GitHub connected, scanning 132 files...

24 features extracted, 8 core memories saved

$ remb context

8 core memories loaded

3 recent conversations restored

Architecture: Next.js + tRPC + Prisma

→ AI now has full project context

The Problem

Every chat starts from zero

AI assistants lose all context between conversations. You repeat yourself, re-explain architecture, and watch your AI make the same mistakes you already corrected. Hours of shared context gone with every new window.

Repeated Explanations

"This project uses App Router with server actions..." — typed for the 50th time.

Lost Decisions

You spent hours on a state management approach. New session? AI suggests the one you rejected.

No Project Awareness

AI doesn’t know your folder structure, naming conventions, or dependencies.

Broken Continuity

Yesterday you built auth together. Today: "What authentication approach would you like to use?"

Features

Everything your AI needs to remember

Core

Tiered Persistent Memory

Memories in three layers — Core loads every session, Active surfaces on-demand, Archive stores long-term.

core · active · archive
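A rough sketch of the tiering in Go (type and method names here are illustrative, not Remb's internals):

```go
package main

import "fmt"

// Tier is the persistence layer a memory lives in.
type Tier int

const (
	Core    Tier = iota // loaded at the start of every session
	Active              // surfaced on-demand when relevant
	Archive             // long-term storage, searched explicitly
)

type Memory struct {
	Feature string
	Content string
	Tier    Tier
}

type Store struct{ memories []Memory }

func (s *Store) Save(m Memory) { s.memories = append(s.memories, m) }

// SessionContext returns only Core memories — the set injected
// automatically when a new conversation starts.
func (s *Store) SessionContext() []Memory {
	var out []Memory
	for _, m := range s.memories {
		if m.Tier == Core {
			out = append(out, m)
		}
	}
	return out
}

func main() {
	s := &Store{}
	s.Save(Memory{"auth", "PKCE OAuth with refresh rotation", Core})
	s.Save(Memory{"billing", "Stripe webhooks, idempotent handlers", Active})
	fmt.Println(len(s.SessionContext())) // only the Core memory loads at session start
}
```

The point of the split: session startup cost stays constant no matter how much you archive.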

Multi-Agent Scanning

5-phase pipeline: Scout, Analyze, Architect, Review, Finalize. Extracts features, code symbols, architecture layers, and dependency graphs from your entire codebase.
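One way to picture the phase ordering (phase names from the text; the function shape is invented for illustration, not Remb's actual pipeline API):

```go
package main

import "fmt"

// Phase is one stage of the scanning pipeline: each phase takes the
// findings so far and refines them before handing off to the next.
type Phase struct {
	Name string
	Run  func(findings []string) []string
}

// Pipeline runs phases strictly in order, threading output to input.
func Pipeline(phases []Phase, seed []string) []string {
	out := seed
	for _, p := range phases {
		out = p.Run(out)
	}
	return out
}

func main() {
	tag := func(label string) func([]string) []string {
		return func(in []string) []string {
			var out []string
			for _, f := range in {
				out = append(out, f+" → "+label)
			}
			return out
		}
	}
	phases := []Phase{
		{"Scout", tag("scouted")},
		{"Analyze", tag("analyzed")},
		{"Architect", tag("architected")},
		{"Review", tag("reviewed")},
		{"Finalize", tag("finalized")},
	}
	fmt.Println(Pipeline(phases, []string{"src/auth"})[0])
}
```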

Conversation Continuity

Every session is logged — what was discussed, built, decided. Next session starts with full history.

Multi-project

Cross-Project Intelligence

Search memories across all projects. Say "do it like in project X" and your AI pulls matching patterns.

Code Graph & Architecture

Queryable graph of every function, class, and component. Architecture layers auto-detected. Trace call chains, imports, and data flows.
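Conceptually, "trace call chains" means a graph walk. A toy version, assuming a simple adjacency-list shape (not Remb's storage format; the symbol names are made up):

```go
package main

import "fmt"

// calls maps a symbol to the symbols it invokes.
var calls = map[string][]string{
	"LoginPage":     {"signIn"},
	"signIn":        {"createSession"},
	"createSession": {"db.insert"},
}

// TraceChain follows call edges from a root symbol, breadth-first,
// visiting each symbol once.
func TraceChain(root string) []string {
	chain := []string{root}
	seen := map[string]bool{root: true}
	queue := []string{root}
	for len(queue) > 0 {
		cur := queue[0]
		queue = queue[1:]
		for _, callee := range calls[cur] {
			if !seen[callee] {
				seen[callee] = true
				chain = append(chain, callee)
				queue = append(queue, callee)
			}
		}
	}
	return chain
}

func main() {
	fmt.Println(TraceChain("LoginPage")) // LoginPage → signIn → createSession → db.insert
}
```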

AI Chat with Full Context

Chat with your codebase. The AI assembles relevant memories, code symbols, and conversation history into every response — 15 tools at its disposal.

Development Plans

AI creates phased development plans. Track each phase; plans auto-complete when every phase finishes. Visible in chat and dashboard.

Secure by Default

Credentials stored with chmod 600. OAuth PKCE for login. Scoped tokens per project.

Web Dashboard

Visual project explorer, feature graph, memory manager, conversation history browser, MCP hub for connecting external AI tools.

How It Works

Three steps to permanent context

01

Install & Init

Install the CLI, then run init — it auto-detects your IDE, offers to sign in, registers the project on Remb, and injects AI context into the right config files.

Terminal

$ curl -fsSL https://www.useremb.com/install.sh | sh

$ remb init

ℹ Detected IDE: VS Code (GitHub Copilot)

Sign in now? [Y/n]: Y

✔ Authenticated as samie105!

✔ Project my-app initialized!

ℹ AI context injected into: .github/copilot-instructions.md

02

Scan & Remember

Scan your codebase with Remb. It analyzes every file — features, patterns, dependencies. Save decisions as persistent memories.

Terminal

$ remb scan

✔ Scanning 247 files across 12 directories...

✔ Extracted 31 features, 5 service boundaries

$ remb save -f auth -c "Using PKCE OAuth with refresh rotation"

✔ Saved to auth (core tier)

03

Auto-Load Every Session

When your AI starts a new conversation, Remb’s MCP server automatically injects project context, conversation history, and relevant memories.

Terminal

$ # AI automatically calls on session start:

→ remb__memory_load_context()

Loading 8 core memories...

Loading 3 recent conversations...

Loading feature map (31 features)...

→ AI is now fully context-aware

Model Context Protocol

Native MCP integration

MCP is the open protocol that lets AI assistants connect to external tools. Remb exposes 42 tools as a first-class MCP server, meaning Claude, Cursor, Windsurf, VS Code Copilot, and any MCP client can access your project memory, code graph, plans, and scanning pipeline natively.

42 MCP Tools

Memory, conversations, plans, code graph, scanning, cross-project search, architecture analysis — all as autonomous tool calls.

Remote HTTP Server

Connect via useremb.com — no local binary needed. Add the URL to your MCP config and your AI has instant access.

Local stdio Mode

Run remb serve for a local MCP server. Works offline with any client that supports stdio transport.
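Under the hood, stdio transport is just newline-delimited JSON messages over stdin/stdout. A minimal sketch of such a loop in Go (simplified; not the actual remb serve implementation — the request/response shapes here are invented):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

type request struct {
	ID     int    `json:"id"`
	Method string `json:"method"`
}

type response struct {
	ID     int    `json:"id"`
	Result string `json:"result"`
}

// handle dispatches one request. A real MCP server implements the
// full protocol (initialize, tool listing, tool calls); this only
// echoes the method name back.
func handle(req request) response {
	return response{ID: req.ID, Result: "handled " + req.Method}
}

func main() {
	// stdio transport: one JSON message per line on stdin,
	// one JSON reply per line on stdout.
	sc := bufio.NewScanner(os.Stdin)
	enc := json.NewEncoder(os.Stdout)
	for sc.Scan() {
		var req request
		if err := json.Unmarshal(sc.Bytes(), &req); err != nil {
			fmt.Fprintln(os.Stderr, "bad request:", err)
			continue
		}
		enc.Encode(handle(req))
	}
}
```

Because everything rides on stdin/stdout, this works offline and needs no open ports — which is why any stdio-capable client can drive it.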

Auto-Session Protocol

On session start, Remb automatically loads project context, conversation history, and architecture layers.

Remote (HTTP)

mcp config
{
  "mcpServers": {
    "remb": {
      "type": "http",
      "url": "https://www.useremb.com/api/mcp"
    }
  }
}

Local stdio (offline)

mcp config
{
  "mcpServers": {
    "remb": {
      "command": "remb",
      "args": ["serve"]
    }
  }
}

Claude Desktop · Cursor · VS Code Copilot · Windsurf · Zed · Neovim

Command Line

Full-featured CLI

Written in Go for instant startup. Ships as a single binary with zero runtime dependencies. Also available as a Node.js CLI via npm.

remb init

Initialize project — detects IDE, injects AI context, registers on Remb.

remb login

Authenticate via browser OAuth or API key.

remb scan

Trigger a cloud scan — checks git status, extracts features, updates context. Recommended over push.

remb push

Deprecated alias for remb scan — kept for backward compatibility.

remb save

Save a context entry for a specific feature or module.

remb get

Retrieve context entries with filtering by feature.

remb serve

Start the MCP server over stdio for AI tool integration.

remb context

Load full project context bundle — memories, features, tech stack.

remb memory

Create, list, search, promote, or delete persistent memories.

Terminal — remb

$ remb scan --path src/auth --depth 3

Scanning src/auth (depth: 3)...

Found 8 files, 3 feature boundaries

auth-provider.tsx Authentication

session.ts Session Management

middleware.ts Auth Middleware

3 features updated, 12 context entries saved

$ remb get -f auth --format table

───────────────────────────────────────────────
Feature            Content
───────────────────────────────────────────────
Authentication     PKCE OAuth + refresh tokens
Session Mgmt       Server-side with httpOnly
Auth Middleware    Edge middleware, JWT verify
───────────────────────────────────────────────

$ remb link --from auth --to session --type depends_on

Linked auth → session (depends_on)

Get Started

Install in seconds

curl

Zero dependencies, Go binary

curl -fsSL https://www.useremb.com/install.sh | sh

npm

Node.js CLI

npm install -g remb-cli

Homebrew

macOS & Linux

brew tap samie105/remb && brew install remb

VS Code

Extension marketplace

ext install remb.remb

Quick Start

From zero to full AI context in 4 steps.

1

Install

Go binary is fastest — single binary, no runtime.

curl -fsSL https://www.useremb.com/install.sh | sh

2

Initialize

Auto-detects IDE, offers sign-in, registers project, injects AI context.

remb init

3

Scan

Trigger a cloud scan to extract features from your codebase.

remb scan

4

Connect your AI

Add Remb as an MCP server. Context injection is automatic.

{
  "mcpServers": {
    "remb": {
      "type": "http",
      "url": "https://www.useremb.com/api/mcp"
    }
  }
}

Stop repeating yourself to your AI

Set up once. Every AI conversation from here on starts with full project context.

Star on GitHub