Your agents share the same memory.
Correct something in Claude Code. Cursor already knows. OpenClaw remembers it tomorrow. Runs on your machine. No cloud, no account, no lock-in.
FREE · LOCAL · PRIVATE · YOURS
The problem

You corrected your agent's coding style last Tuesday. Today it made the same mistake. You explained your project's architecture to Cursor this morning. Tonight, Claude Code has no idea. Every conversation starts from zero.

PLUR gives your agents a shared memory that persists, learns from feedback, and travels between tools and devices. Corrections stick. Preferences compound. Models are commodities — your knowledge isn't. PLUR makes it portable.

Proven, not promised
Proven. 89% win rate.
We gave the same tasks to Claude models, with and without memory. Without it, your agent got house rules right 10–38% of the time. With PLUR: 100%. Every model. Every run.
89% Win rate
100% House rules
0 Penalty
Knowledge type | Description | Record | Win rate
House rules | Tag formats, file placement, project conventions | 12–0 | 100%
Tool routing | Finding the right tool among 100+ options | 10–2 | 83%
Past experience | API quirks, debugging insights, infrastructure details | 4–0 | 100%
Learned style | Communication tone, design preferences | 5–2 | 71%
General tasks | No memory advantage expected | n/a | No penalty
COST
Haiku + PLUR outperforms Opus alone: 2.6x better at roughly a tenth of the cost.
RETRIEVAL
86.7% on LongMemEval — among the top memory systems. Zero search cost. Fully local.
19 decisive contests across Haiku, Sonnet, and Opus. Ties removed — only clear wins and losses count.
Read the full benchmark report →
How it works
Not a database. A learning system.
01

Brain-inspired memory.

Engrams are learned knowledge that strengthens with use and fades when irrelevant. Stale facts drop out naturally, with no manual cleanup. Memory is injected at every stage: session start, planning, skill use, agent spawning, even subagents. Nothing slips through. READ THE SPEC →
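As an illustrative sketch (not PLUR's actual implementation), strengthen-with-use and fade-when-irrelevant can be modeled as exponentially decaying reinforcement: every use adds weight, and each weight halves over a fixed half-life, so unused engrams drift toward zero and drop out of retrieval.

```python
import math

def engram_strength(reinforcement_times, now, half_life_days=30.0):
    """Illustrative scoring only: each past use contributes a weight
    that halves every `half_life_days`, so recently reinforced
    engrams outrank stale ones without any manual cleanup."""
    return sum(
        math.exp(-math.log(2) * (now - t) / half_life_days)
        for t in reinforcement_times
    )

# Reinforced this week vs. last used months ago (times in days).
fresh = engram_strength([100.0, 118.0, 119.0], now=120.0)
stale = engram_strength([0.0, 5.0, 10.0], now=120.0)
assert fresh > stale  # the fresh engram wins retrieval ranking
```

The half-life and the additive combination are assumptions chosen for clarity; the real system may weight recency, frequency, and relevance differently.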

02

One memory, every tool.

Correct something in Claude Code. Cursor already knows. OpenClaw remembers it on Telegram. Hermes picks it up in Python. Multi-store reads your team's shared engrams alongside personal memory — no copying, no syncing. Switch AI models next month; your memory carries over.

03

Don't trust. Verify.

Open the file. Read what your agent remembers. Edit it. Delete it. It's YAML on your disk — not a database, not a cloud, not a black box. Apache-2.0 licensed, forever. Unlike Mem0, Letta, or Zep, nothing leaves your machine unless you choose.
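As a sketch of what "YAML on your disk" can look like, here is a hypothetical engram entry; the file name and field names are illustrative, not PLUR's actual schema.

```yaml
# engrams.yaml: plain text you can read, edit, or delete by hand.
# Field names here are illustrative, not the actual PLUR schema.
- id: prefer-named-exports
  domain: code.typescript
  content: "Use named exports; avoid default exports."
  source: user-correction   # where the memory came from
  strength: 0.92            # reinforced with use, fades when stale
  last_used: 2025-06-01
```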

New in 0.7
Knowledge Packs. Share what you know.
What took you six months to teach your agent, someone else installs in six seconds. Export a curated pack, share it with your team or community, and anyone can install it.
EXPORT
plur packs export react-patterns \
  --domain code.react \
  --tags hooks,state
A privacy scan blocks secrets and personal data. Each pack carries a SHA-256 integrity hash.
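You can verify a pack's hash yourself. The sketch below assumes the published hash covers the raw file bytes (PLUR may compute it differently), and `pack_sha256` is a hypothetical helper name.

```python
import hashlib

def pack_sha256(path):
    """Hash a pack file's raw bytes in chunks, so large packs
    don't need to fit in memory; compare the result against
    the hash published with the pack."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

If the hex digest matches the published value, the pack was not modified in transit.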
INSTALL
plur packs install ./react-patterns.yaml
Conflict detection flags duplicates and contradictions against your existing engrams. Packs uninstall cleanly.
TEAM SHARING
plur stores add ~/team/engrams.yaml \
  --scope my-team --readonly
Multi-store reads your team's knowledge alongside personal memory. No copying, no syncing. Free.
LIFECYCLE HOOKS
plur init installs 9 lifecycle hooks. Memory is injected at session start, plan mode, skill invocation, agent spawn, and subagent start. Observations are logged for offline pattern extraction.

Hooks fire 100% of the time vs LLM-driven learning at ~80%.
Get started
Tell your agent to install it.
Free. No account. No migration. Your next session is already smarter.

OpenClaw

Two commands, fully automatic:
openclaw plugins install @plur-ai/claw
openclaw gateway --force

Claude Code

One command sets up everything:
npx @plur-ai/cli init
Sets up local storage and MCP config. Restart Claude Code to activate.

Hermes Agent

One pip install, auto-discovered:
pip install plur-hermes
# Auto-hooks into pre/post LLM calls.
# 16 tools + SKILL.md bundled.
Just use your tools as usual. PLUR works in the background — corrections and preferences accumulate automatically across every model, tool, and machine. Changes become noticeable within days.
Claude Code
OpenClaw
Cursor
Windsurf
Hermes
Any MCP Client
Where this goes
Private by default. Collective by choice.
Knowledge Packs are live. Export what you've learned, share it with your team or community, and anyone installs it in seconds. Multi-store reads shared engrams alongside personal memory — no copying. Your knowledge stays yours; you choose what to share.

Verifiable

Your memory lives on your machine. Open the file, see what's there. No platform can revoke it or hold it hostage.

Portable

An AI that remembers your preferences, your patterns, your way of thinking — and carries that across every tool you use.

Future-proof

Your memory outlives any model. Switch from Claude to GPT to whatever comes next. The knowledge transfers.

Private

Nothing shared without your say. Nothing taken without giving back. Private until you choose otherwise.

FOR ENTERPRISE
Need managed deployment?
PLUR is free for everyone — individuals and teams. Knowledge Packs and Multi-Store are included. For organizations that need managed infrastructure and dedicated support, we offer enterprise plans.
The path
From personal memory to collective intelligence.
MEMORY

Your agent remembers.

Corrections stick. Preferences compound. Every session builds on the last. One agent, one developer — a memory that gets better with use, not just bigger.

WISDOM

Your team shares what it knows.

Knowledge Packs and Multi-Store let teams pool hard-won knowledge without giving up ownership. What took one person six months, another installs in six seconds.

COLLECTIVE INTELLIGENCE

Every agent lifts every other.

A network of sovereign agents, each choosing what to share. Not a hive mind — a chorus. The future of intelligence is not artificial. It is plural.

The Summit
Memory. Wisdom.
Collective intelligence.
The future of intelligence is not artificial. It is plural.
Subscribe your agent to the newsletter. Humans allowed too.