
Show HN: Vektor – local-first associative memory for AI agents

TL;DR

Vektor is a local-first memory system for AI agents – no cloud dependency; all data is stored on-device in SQLite.

Key Points

  • Its core is a MAGMA graph with four memory layers, mapping associative links between stored memories.
  • The AUDN curation loop automatically decides for each new input: add, update, delete, or no-op.
  • A background REM process compresses and consolidates memory contents, analogous to sleep consolidation in humans.
  • Installable via npm (vektor-slipstream) with built-in Claude tool support; version 1.3.6 is approaching final release.
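The AUDN loop described above decides, per input, between add, update, delete, and no-op. A minimal TypeScript sketch of what such a decision could look like – assuming a toy word-overlap similarity, arbitrary thresholds, and a hypothetical `forget:` retraction convention; none of this is Vektor's actual API:

```typescript
// Hypothetical AUDN-style curation decision. Names, thresholds, and the
// "forget:" convention are illustrative assumptions, not Vektor's API.
type AudnAction = "add" | "update" | "delete" | "noop";

interface MemoryItem {
  id: string;
  text: string;
}

// Toy similarity: word-overlap ratio. A real system would use embeddings.
function similarity(a: string, b: string): number {
  const wa = new Set(a.toLowerCase().split(/\s+/));
  const wb = new Set(b.toLowerCase().split(/\s+/));
  let shared = 0;
  for (const w of wa) if (wb.has(w)) shared++;
  return shared / Math.max(wa.size, wb.size);
}

// Decide what to do with a new input relative to the closest stored memory.
function curate(
  input: string,
  store: MemoryItem[],
): { action: AudnAction; targetId?: string } {
  const isRetraction = input.startsWith("forget:");
  const text = isRetraction ? input.slice("forget:".length).trim() : input;

  let best: MemoryItem | undefined;
  let bestSim = 0;
  for (const item of store) {
    const s = similarity(text, item.text);
    if (s > bestSim) {
      bestSim = s;
      best = item;
    }
  }

  if (best && isRetraction && bestSim >= 0.5) {
    return { action: "delete", targetId: best.id }; // explicit retraction
  }
  if (!best || bestSim < 0.5) return { action: "add" }; // genuinely new
  if (bestSim >= 0.9 && text.length <= best.text.length) {
    return { action: "noop", targetId: best.id }; // near-duplicate
  }
  return { action: "update", targetId: best.id }; // richer/revised version
}
```

The point of the sketch is the control flow, not the similarity metric: every input resolves to exactly one of the four outcomes, so the store never grows by default.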

Nauti's Take

The combination of MAGMA graph, AUDN loop, and REM compression suggests a thoughtfully designed system architecture – and it is refreshing to see an agent tool that treats privacy by design as a real constraint rather than a marketing claim. That said, a one-point HN post with zero comments is not proof of production-readiness.

The open call for DB testing expertise signals the product is still pre-stable, which is honestly communicated but warrants caution before dropping it into critical agent pipelines. Anyone building persistent, off-cloud memory for AI agents should keep Vektor on the radar – just wait for a solid post-1.3.6 review before committing.

Context

Agent memory remains one of the unsolved core problems in the AI stack – most systems lose context after each session or offload everything to a third-party database. Vektor tackles both pain points simultaneously: local data sovereignty plus structured, associative forgetting rather than raw token accumulation. The AUDN loop is conceptually interesting because it actively reduces redundancy instead of just flooding the context buffer.
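The redundancy-reduction claim can be made concrete with a toy comparison: an append-only log grows with every repeated observation, while a store that skips near-duplicates stays bounded. A sketch under assumed thresholds – the class, tokenizer, and cutoff are illustrative, not Vektor's implementation:

```typescript
// Toy contrast between raw accumulation and curated storage.
// All names and thresholds here are illustrative assumptions.
function tokenize(s: string): Set<string> {
  return new Set(s.toLowerCase().split(/\s+/));
}

function overlap(a: Set<string>, b: Set<string>): number {
  let shared = 0;
  for (const w of a) if (b.has(w)) shared++;
  return shared / Math.max(a.size, b.size);
}

class CuratedStore {
  private items: string[] = [];
  add(text: string): void {
    const t = tokenize(text);
    // Skip inputs that mostly repeat something already stored.
    if (this.items.some((it) => overlap(tokenize(it), t) >= 0.8)) return;
    this.items.push(text);
  }
  get size(): number {
    return this.items.length;
  }
}

const log: string[] = []; // append-only baseline
const store = new CuratedStore();
const inputs = [
  "meeting moved to 3pm",
  "meeting moved to 3pm", // exact repeat
  "Meeting moved to 3pm", // case-only repeat
  "deploy scheduled for friday",
];
for (const i of inputs) {
  log.push(i);
  store.add(i);
}
// The log holds all 4 entries; the curated store keeps only 2 distinct facts.
```

With embeddings instead of word overlap, the same shape of check would also catch paraphrases, which is presumably what an AUDN-style loop is after.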

Whether the architecture holds up in real workloads is something benchmarks will need to confirm.

Sources