Template repo for a file-based shared-memory workflow.
New agent or new node?
- start with START-HERE.md
- for first setup, also read INSTALL.md
MotoBotMemoryKit is a starter kit for building a private shared-memory repo for one user, team, or set of machines.
It gives you:
- the folder structure
- the operating model
- the export/promote scripts
- onboarding and smoke-test docs
- a small fake dataset so the workflow is understandable from day one
- a linked sample cluster in canon plus a staged sample import payload
This repo is not meant to become your live shared-memory canon by itself.
Instead:
- you use this kit as the starting point
- you create your own private repo from it
- your machines and agents operate inside that new repo
In other words:
- MotoBotMemoryKit = template
- your future repo = live memory system
The purpose of this kit is to teach and bootstrap a shared-memory workflow that is:
- file-based
- Git-backed
- local-first
- safe for multiple nodes and multiple agents
Shipped as sample data:
- MEMORY.md
- lessons/*.md
- daily/*.md
- projects/*.md
- imports/<machine-slug>/
Also included:
- reusable scripts
- onboarding and smoke-test docs
Deliberately not included:
- real project databases
- auth/session state
- runtime queues
- credentials
- private historical memory
To adopt the kit:
- create a new private repo of your own
- initialize it from this kit
- run a smoke test with a temporary node slug
- confirm the exports look sane
- only then start treating it as your real shared-memory repo
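The adoption steps above can be sketched as a shell session. Everything here is illustrative: the temp directories stand in for your kit checkout and your new private repo, and the slug `smoke-test` is an arbitrary temporary node slug, not a kit requirement.

```shell
set -euo pipefail

KIT_DIR="$(mktemp -d)"            # stand-in for your clone of MotoBotMemoryKit
NEW_REPO="$(mktemp -d)/my-memory" # stand-in for your future private repo

# simulate a kit checkout with the expected top-level dirs
mkdir -p "$KIT_DIR"/daily "$KIT_DIR"/lessons "$KIT_DIR"/projects \
         "$KIT_DIR"/imports "$KIT_DIR"/scripts

# 1. initialize the private repo from the kit contents
mkdir -p "$NEW_REPO"
cp -R "$KIT_DIR"/. "$NEW_REPO"/
git -C "$NEW_REPO" init -q
git -C "$NEW_REPO" add -A

# 2. smoke test with a temporary node slug: stage one throwaway export
SLUG="smoke-test"
mkdir -p "$NEW_REPO/imports/$SLUG"
echo "smoke export placeholder" > "$NEW_REPO/imports/$SLUG/$(date +%F)-$SLUG.md"

# 3. confirm the staged export looks sane before promoting anything
ls "$NEW_REPO/imports/$SLUG"
```

Only after this throwaway cycle looks right should the repo be treated as live canon.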
If you are starting from zero, read INSTALL.md.
- canonical daily files: daily/YYYY-MM-DD-<machine-slug>.md
- canonical active snapshots: projects/active-db-snapshot-<machine-slug>.md
- staged node exports: imports/<machine-slug>/...
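The naming scheme above means every per-node path is derivable from the slug and the date. A minimal sketch (the slug `anchorage-mini` is an invented example, not a real node):

```shell
SLUG="anchorage-mini"   # example machine slug, an assumption
TODAY="$(date +%F)"     # YYYY-MM-DD

DAILY="daily/${TODAY}-${SLUG}.md"                      # canonical daily file
SNAPSHOT="projects/active-db-snapshot-${SLUG}.md"      # canonical active snapshot
STAGING="imports/${SLUG}"                              # staged node exports

echo "$DAILY"
echo "$SNAPSHOT"
echo "$STAGING"
```

Because the slug is embedded in every filename, two nodes can never produce the same canonical path by accident.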
Collisions should fail rather than overwrite canon silently.
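A minimal sketch of that fail-on-collision rule; the real `scripts/promote-import.sh` logic may differ, and the filenames here are simulated:

```shell
set -euo pipefail

SRC="$(mktemp -d)"; DST="$(mktemp -d)"
echo "staged" > "$SRC/2024-01-01-nodeA.md"
echo "canon"  > "$DST/2024-01-01-nodeA.md"   # simulate an existing canon file

# refuse to promote over an existing canon file instead of overwriting it
promote() {
  local src="$1" dst="$2"
  if [ -e "$dst" ]; then
    echo "collision: $dst already exists" >&2
    return 1
  fi
  cp "$src" "$dst"
}

if promote "$SRC/2024-01-01-nodeA.md" "$DST/2024-01-01-nodeA.md"; then
  echo "promoted"
else
  echo "refused to overwrite canon"
fi
```

The canon file is left untouched; a human has to resolve the collision explicitly.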
Claude, Codex, and any other agent on the same machine are still one node.
That means:
- they may all write into the same local bridge
- they export into the same imports/<machine-slug>/
- but pull -> export -> commit -> push must be serialized per machine
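One common way to serialize that cycle per machine is a `flock(1)` guard, so two agents on the same node cannot interleave sync cycles. A sketch under assumptions: the lock path is invented, the echoed steps stand in for the real commands, and `flock` is a util-linux tool (on macOS you would need a substitute):

```shell
set -euo pipefail

LOCK="/tmp/memory-sync.lock"   # lock path is an assumption

sync_cycle() {
  (
    # -n: fail fast instead of queueing a second cycle behind the first
    flock -n 9 || { echo "another sync is running; skipping" >&2; exit 1; }
    echo "pull"; echo "export"; echo "commit"; echo "push"
  ) 9>"$LOCK"
}

sync_cycle
```

Any agent can call `sync_cycle`; only one cycle runs at a time, and latecomers skip rather than corrupt the sequence.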
Scripts:
- scripts/export-node-memory.sh <machine-slug>
- scripts/promote-import.sh <machine-slug>
- scripts/pull-and-promote.sh <local-machine-slug>
- scripts/rebuild-indexes.sh
- scripts/normalize-lessons.sh
- scripts/normalize-recent-dailies.sh
- scripts/audit-wiki-metadata.sh
- scripts/bootstrap-from-bridge.sh
scripts/bootstrap-from-bridge.sh is disabled by default.
Use it only with explicit opt-in in a private derived repo, because it can copy raw local bridge content into the Git tree.
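One common pattern for that kind of explicit opt-in is an environment-variable guard. This is only a sketch: the variable name `MEMORYKIT_ALLOW_BOOTSTRAP` is an assumption, not the kit's actual mechanism; check bootstrap-from-bridge.sh itself for the real guard.

```shell
set -euo pipefail

# refuse to run unless the operator has explicitly opted in
guard() {
  if [ "${MEMORYKIT_ALLOW_BOOTSTRAP:-0}" != "1" ]; then
    echo "disabled: set MEMORYKIT_ALLOW_BOOTSTRAP=1 to opt in"
    return 1
  fi
  echo "opt-in confirmed"
}

guard || true                      # default: refuses
MEMORYKIT_ALLOW_BOOTSTRAP=1 guard  # explicit opt-in: proceeds
```

The default path does nothing destructive; copying raw bridge content into the Git tree only becomes possible after a deliberate, per-invocation choice.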
This repo demonstrates the workflow; it is not a live shared-memory canon. It is a stripped example derived from a live shared-memory system: it preserves the working model and scripts, but ships only sample data.
Once a derived repo is live on a real node, add a node-local wrapper script plus a cron entry so export/pull/promote also happens automatically as a safety net.
Suggested cadence for active nodes (01:30 and 13:30 daily):
30 1,13 * * * <path-to-node-wrapper>
Suggested wrapper shape:
- fail if repo is dirty
- git pull --ff-only
- run scripts/export-node-memory.sh <machine-slug>
- commit + push node export changes
- run scripts/pull-and-promote.sh <machine-slug>
- rebuild indexes if canonical dirs changed
- commit + push promotions
- guard with a lock file
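The wrapper shape above could be sketched as the following script. This is template guidance, not the kit's shipped wrapper: the `REPO`/`SLUG` variables, lock location, commit messages, and the list of canonical dirs checked before reindexing are all assumptions to adapt in a derived repo.

```shell
# write the sketch out as a file so it can be adapted and syntax-checked
cat > /tmp/node-wrapper.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
REPO="${REPO:?set REPO to your derived repo path}"
SLUG="${SLUG:?set SLUG to this machine's slug}"

# guard with a lock file so overlapping cron runs skip instead of colliding
exec 9>"$REPO/.sync.lock"
flock -n 9 || { echo "sync already running" >&2; exit 0; }

cd "$REPO"
# fail if repo is dirty
[ -z "$(git status --porcelain)" ] || { echo "repo dirty; aborting" >&2; exit 1; }

git pull --ff-only
scripts/export-node-memory.sh "$SLUG"
git add -A
git diff --cached --quiet || git commit -m "node export: $SLUG"
git push

scripts/pull-and-promote.sh "$SLUG"
# rebuild indexes if canonical dirs changed (dir list is an assumption)
git diff --quiet -- daily projects lessons || scripts/rebuild-indexes.sh
git add -A
git diff --cached --quiet || git commit -m "promotions: $SLUG"
git push
EOF
chmod +x /tmp/node-wrapper.sh
bash -n /tmp/node-wrapper.sh && echo "wrapper parses"
```

The `git diff --cached --quiet || git commit` idiom only commits when something is actually staged, so an idle cron run leaves no empty commits.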
This should be shipped as template guidance or a helper script in a later refinement pass.