
kontxt for developers

Open-source AI memory with a local SQLite vault under ~/.kontxt/, tiered embeddings, MCP for IDE hosts, and a CLI. For the full reference (flags, host snippets, API tables), see docs.4stax.com.


Current release: v0.1.0.

Capabilities in v0.1.0

  • @4stax/kontxt v0.1.0 on npm: `npx -y @4stax/kontxt init` creates the vault and can auto-configure Cursor and Claude Desktop.
  • Full CLI: add, search, list, edit, delete, capture/extract from transcripts, scan directories, serve/start/stop MCP, status, vacuum/decay.
  • MCP tools for hosts: get_relevant_context, search_memories, list_memories, store_memory, store_conversation_summary, auto_capture, get_user_profile, delete_memory; prompt resource kontxt_context for top memories at session start.
  • Relevance blend: semantic similarity, recency (30-day exponential decay), access frequency (log-scaled), and importance; memories carry embedding_tier so provider switches never mix incompatible vectors.
  • Embeddings: OpenAI text-embedding-3-small (optional key), offline Transformers.js all-MiniLM-L6-v2, Ollama if available, or pseudo keyword fallback.

Release roadmap

  • v0.1 (shipped): Local vault, CLI, MCP server, relevance scoring, auto-capture, Transformers.js offline embeddings.
  • v0.2: Browser extension, cross-provider prompt injection.
  • v0.3: React dashboard, permission controls, audit log.
  • v0.4: Optional encrypted cloud sync.

Install and MCP

`npx -y @4stax/kontxt init` creates ~/.kontxt/, optionally stores an OpenAI key (`--key sk-...`), and can write MCP entries for supported hosts. Run `kontxt serve` (foreground) or `kontxt start` (daemon) so MCP hosts can connect.

@4stax/kontxt init

```bash
$ npx -y @4stax/kontxt init
```
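Putting the options from this section together (the `--key` value is a placeholder, shown exactly as elided above):

```bash
# One-time setup; --key optionally stores an OpenAI key for embeddings
$ npx -y @4stax/kontxt init --key sk-...

# Run the MCP server in the foreground, or as a background daemon
$ kontxt serve
$ kontxt start
```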

Cursor, Claude Desktop, and other MCP host configs: exact JSON paths and samples are maintained on the documentation site so they stay up to date with releases.
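For rough orientation only (this is not a maintained sample; key names and file paths vary by host and release, so defer to the docs site), an MCP server entry for kontxt typically looks something like:

```json
{
  "mcpServers": {
    "kontxt": {
      "command": "npx",
      "args": ["-y", "@4stax/kontxt", "serve"]
    }
  }
}
```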

Vault layout

  • ~/.kontxt/
    • vault.db: SQLite (memories, embeddings, scores)
    • config.json: keys and settings
    • models/: Transformers.js cache after first offline embed

Embedding backends

Memories carry an embedding_tier. Search only compares vectors from the same tier so switching providers never mixes incompatible spaces.
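A minimal sketch of what tier-scoped search means in practice. The `Memory` shape, `cosine`, and `searchByTier` are illustrative names, not kontxt's actual API; the real schema lives in vault.db.

```typescript
// Illustrative memory shape; field names are assumptions.
interface Memory {
  id: string;
  text: string;
  embeddingTier: string; // e.g. "openai-3-small", "minilm-l6-v2", "pseudo"
  embedding: number[];
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Only memories whose tier matches the query embedding's tier are compared,
// so vectors from different providers never mix.
function searchByTier(memories: Memory[], queryEmbedding: number[], queryTier: string) {
  return memories
    .filter((m) => m.embeddingTier === queryTier)
    .map((m) => ({ id: m.id, score: cosine(m.embedding, queryEmbedding) }))
    .sort((a, b) => b.score - a.score);
}
```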

1. OpenAI

text-embedding-3-small with optional --key. Strong quality, low cost per call.

2. Transformers.js

all-MiniLM-L6-v2 offline after first download to ~/.kontxt/models/.

3. Ollama

Uses ollama serve with an embed model pulled locally.

4. Pseudo

Hashed bag-of-words fallback. Always available; keyword-level only.
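A hashed bag-of-words embedding can be sketched in a few lines. The dimension, hash function, and normalization below are assumptions for illustration, not kontxt's actual implementation:

```typescript
// Each token hashes to a bucket; token counts become the vector.
const DIM = 256; // illustrative dimension

// FNV-1a hash mapped into [0, DIM).
function hashToken(token: string): number {
  let h = 2166136261;
  for (let i = 0; i < token.length; i++) {
    h ^= token.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return (h >>> 0) % DIM;
}

function pseudoEmbed(text: string): number[] {
  const vec = new Array<number>(DIM).fill(0);
  for (const token of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    vec[hashToken(token)] += 1;
  }
  // L2-normalize so cosine comparisons behave.
  const norm = Math.sqrt(vec.reduce((s, v) => s + v * v, 0)) || 1;
  return vec.map((v) => v / norm);
}
```

This captures keyword overlap only: two texts score high when they share tokens, with none of the semantic generalization the real embedding tiers provide.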

MCP tools

Hosts call these tools over MCP. JSON schemas are versioned in the repo and documented in the docs.

get_relevant_context

Semantic search: top-scored memories for the current task.

search_memories

Search with scores and stable memory IDs.

list_memories

List vault contents; optional filters (e.g. by project).

store_memory

Persist a fact or preference from the session.

store_conversation_summary

Extract durable facts from a full transcript.

auto_capture

Like store_conversation_summary, but with more aggressive automatic memory-type inference.

get_user_profile

Memories grouped by type for quick context.

delete_memory

Remove a memory by ID (partial IDs supported).

Prompt resource kontxt_context: injects top memories as system context at conversation start.

Relevance scoring

Retrieval blends semantic similarity, recency decay (roughly a 30-day horizon), log-scaled access frequency, and an importance score. Only embeddings from the same tier participate in similarity. Exact weights and formulas are spelled out in the docs.
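The blend can be sketched as a weighted sum. The weights, the 30-day half-life, and the frequency cap below are illustrative assumptions; kontxt's exact coefficients are in the docs.

```typescript
// Illustrative weights; kontxt's actual coefficients are documented separately.
const WEIGHTS = { similarity: 0.5, recency: 0.2, frequency: 0.15, importance: 0.15 };
const HALF_LIFE_DAYS = 30; // "roughly a 30-day horizon"

function relevance(
  similarity: number,     // cosine similarity in [0, 1], same embedding tier only
  daysSinceAccess: number,
  accessCount: number,
  importance: number      // normalized to [0, 1]
): number {
  // Exponential recency decay: halves every HALF_LIFE_DAYS.
  const recency = Math.exp((-Math.LN2 * daysSinceAccess) / HALF_LIFE_DAYS);
  // Log-scaled access frequency, capped at 1 (here: saturates at 100 accesses).
  const frequency = Math.min(Math.log1p(accessCount) / Math.log1p(100), 1);
  return (
    WEIGHTS.similarity * similarity +
    WEIGHTS.recency * recency +
    WEIGHTS.frequency * frequency +
    WEIGHTS.importance * importance
  );
}
```

With these placeholder weights, a perfect semantic match accessed today scores 0.7, and the recency term alone halves after 30 idle days.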

Memory types

fact, preference, project, decision, skill, episodic

Architecture (local plane)

Today: SQLite, kontxt process, MCP stdio. Hosted plane (roadmap): Postgres, object storage, vector index, consent ledger. OSS kontxt remains the local reference.

  • Local vault: ~/.kontxt/vault.db (SQLite), config.json, models/ cache for offline embeddings after first use.
  • Memory types: fact, preference, project, decision, skill, episodic. Tagged and filterable (e.g. list by project).
  • Distribution: npm package; MCP via `npx @4stax/kontxt serve` (foreground) or `kontxt start` (daemon). Init writes ~/.cursor/mcp.json and the Claude Desktop config on macOS.
  • Platform surfaces: web experiences today, with a browser extension and hosted API planned.
  • Hosted plane (roadmap): Postgres, vector search, object storage, and a consent ledger for multi-tenant and org features.

Next steps

Install via npm and browse the source on GitHub. The full reference lives in the docs; the Memory product page covers the product overview. Contact hello@4stax.com.