
feat: add atomic memory consolidation (Store.Consolidate + MCP mem_consolidate)#246

Open
Snakeblack wants to merge 1 commit into Gentleman-Programming:main from Snakeblack:feat/atomic-mem-consolidate

Conversation

@Snakeblack

Summary

Adds Store.Consolidate() — an atomic operation that soft-deletes N source observations and creates a single new consolidated observation within a single withTx() transaction. Also registers the mem_consolidate MCP tool in ProfileAgent.

Changes

Store Layer

  • Consolidate(idsToDelete []int64, p AddObservationParams) (int64, error): atomic soft-delete + insert in a single withTx()
  • Enqueues SyncOpDelete for each source and SyncOpUpsert for the new observation
  • Silently ignores non-existent IDs; rolls back on any failure
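The store-layer behavior above can be sketched with a minimal in-memory model. Everything here (obs, memStore, the string-content parameter) is a hypothetical stand-in; the real Consolidate runs the same all-or-nothing logic inside withTx() against SQLite:

```go
package main

import (
	"errors"
	"fmt"
)

// obs is a simplified observation row: content plus a soft-delete flag.
type obs struct {
	content string
	deleted bool
}

type memStore struct {
	rows   map[int64]*obs
	nextID int64
}

// consolidate soft-deletes idsToDelete and inserts content as one new
// observation. Missing or already-deleted IDs are silently skipped; an empty
// content stands in for a constraint failure and triggers a full rollback.
func (s *memStore) consolidate(idsToDelete []int64, content string) (int64, error) {
	// Record which flags we flip so a failure can undo them (the transaction
	// rollback plays this role in the real implementation).
	undo := map[int64]bool{}
	for _, id := range idsToDelete {
		row, ok := s.rows[id]
		if !ok || row.deleted {
			continue // silently ignore non-existent or already-deleted
		}
		undo[id] = row.deleted
		row.deleted = true
	}
	if content == "" {
		for id, was := range undo {
			s.rows[id].deleted = was // rollback: restore soft-delete flags
		}
		return 0, errors.New("consolidate: empty content")
	}
	s.nextID++
	s.rows[s.nextID] = &obs{content: content}
	return s.nextID, nil
}

func main() {
	s := &memStore{rows: map[int64]*obs{
		1: {content: "a"}, 2: {content: "b"},
	}, nextID: 2}
	id, err := s.consolidate([]int64{1, 2, 99}, "a+b merged")
	fmt.Println(id, err, s.rows[1].deleted, s.rows[2].deleted)
}
```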

MCP Layer

  • Registered mem_consolidate in ProfileAgent
  • handleConsolidate() with project auto-detection (same pattern as handleSave)

Tests (4 TDD)

  1. Soft-delete sources + new obs created with sync_id
  2. Atomicity rollback on INSERT failure
  3. 3 deletes + 1 upsert = 4 sync mutations
  4. Non-existent IDs silently skipped
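The invariant behind test 3 (N sources consolidated → N delete mutations plus one upsert) can be sketched as follows; syncOp and enqueueConsolidation are hypothetical stand-ins for the store's internal sync bookkeeping:

```go
package main

import "fmt"

// syncOp is a simplified sync mutation: a kind ("delete" or "upsert") and the
// observation ID it applies to.
type syncOp struct {
	kind string
	id   int64
}

// enqueueConsolidation builds the mutation set a consolidation produces:
// one delete per source, then a single upsert for the new observation.
func enqueueConsolidation(sources []int64, newID int64) []syncOp {
	ops := make([]syncOp, 0, len(sources)+1)
	for _, id := range sources {
		ops = append(ops, syncOp{kind: "delete", id: id})
	}
	ops = append(ops, syncOp{kind: "upsert", id: newID})
	return ops
}

func main() {
	ops := enqueueConsolidation([]int64{1, 2, 3}, 4)
	fmt.Println(len(ops)) // 3 deletes + 1 upsert
}
```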

Closes #242

Copilot AI review requested due to automatic review settings April 26, 2026 17:43

Copilot AI left a comment


Pull request overview

Adds an atomic “memory consolidation” operation to merge multiple observations into a single new observation, and exposes it via a new MCP tool (mem_consolidate) for agent-driven memory garbage collection.

Changes:

  • Introduces Store.Consolidate() to soft-delete N observations and insert 1 consolidated observation within a single withTx() transaction (including sync mutation enqueueing).
  • Registers and implements MCP tool mem_consolidate in the agent profile with project auto-detection.
  • Adds store-layer tests covering soft-delete behavior, atomic rollback, sync mutation enqueueing, and skipping non-existent IDs.

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

File Description
internal/store/store.go Adds Store.Consolidate() transactional soft-delete + insert with sync mutation enqueueing.
internal/mcp/mcp.go Registers mem_consolidate tool and adds request handling/argument parsing.
internal/store/store_test.go Adds TDD tests for consolidate behavior (delete + insert, rollback, sync mutations, missing IDs).


Comment thread internal/store/store.go
Comment on lines +2360 to +2366
// Phase 1: Soft-delete sources (ignore non-existent)
for _, id := range idsToDelete {
	obs, err := s.getObservationTx(tx, id)
	if err == sql.ErrNoRows {
		continue // silently ignore non-existent or already-deleted
	}
	if err != nil {

Copilot AI Apr 26, 2026


idsToDelete is processed as-is. If the caller passes duplicate IDs, this will enqueue multiple delete sync mutations for the same observation (second UPDATE will be a no-op due to deleted_at IS NULL, but the mutation still gets enqueued). Consider de-duplicating idsToDelete (and optionally filtering out <=0 IDs) before the loop to keep sync mutations consistent and avoid unnecessary work.
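A de-duplication pass of the kind suggested might look like this; dedupIDs is a hypothetical helper, not part of the PR, that the real Store.Consolidate could run on idsToDelete before the soft-delete loop:

```go
package main

import "fmt"

// dedupIDs drops duplicate and non-positive IDs while preserving first-seen
// order, so each source observation enqueues at most one delete sync mutation.
func dedupIDs(ids []int64) []int64 {
	seen := make(map[int64]struct{}, len(ids))
	out := make([]int64, 0, len(ids))
	for _, id := range ids {
		if id <= 0 {
			continue // filter out invalid IDs
		}
		if _, dup := seen[id]; dup {
			continue // keep only the first occurrence
		}
		seen[id] = struct{}{}
		out = append(out, id)
	}
	return out
}

func main() {
	fmt.Println(dedupIDs([]int64{5, 5, -1, 3, 5, 0, 3})) // [5 3]
}
```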

Comment thread internal/mcp/mcp.go
Comment on lines +878 to +888
// Parse ids_to_delete from JSON array
rawIDs, _ := req.GetArguments()["ids_to_delete"].([]interface{})
var idsToDelete []int64
for _, raw := range rawIDs {
	switch v := raw.(type) {
	case float64:
		idsToDelete = append(idsToDelete, int64(v))
	case json.Number:
		n, _ := v.Int64()
		idsToDelete = append(idsToDelete, n)
	}

Copilot AI Apr 26, 2026


The ids_to_delete parsing silently truncates float64 values and ignores json.Number conversion errors, which can lead to surprising behavior for non-integer inputs (e.g., 1.9 becomes 1) and makes it hard to debug bad requests. Since this tool is destructive, consider validating that every element is a positive integer and returning a clear error if any element is invalid/out of range, instead of best-effort parsing.
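Stricter validation of the kind suggested could look like this; parseIDs is a hypothetical helper (the real handler reads from req.GetArguments()), and it relies on encoding/json decoding JSON numbers as float64 by default:

```go
package main

import (
	"fmt"
	"math"
)

// parseIDs validates a JSON-decoded ids_to_delete array, rejecting
// non-numbers, non-integers, and non-positive values with a clear error
// instead of silently truncating (e.g. 1.9 -> 1) or skipping them.
func parseIDs(raw []interface{}) ([]int64, error) {
	ids := make([]int64, 0, len(raw))
	for i, v := range raw {
		f, ok := v.(float64) // encoding/json's default numeric type
		if !ok {
			return nil, fmt.Errorf("ids_to_delete[%d]: not a number", i)
		}
		if f != math.Trunc(f) || f < 1 || f > math.MaxInt64 {
			return nil, fmt.Errorf("ids_to_delete[%d]: %v is not a positive integer", i, f)
		}
		ids = append(ids, int64(f))
	}
	return ids, nil
}

func main() {
	ids, err := parseIDs([]interface{}{float64(3), float64(7)})
	fmt.Println(ids, err)
	_, err = parseIDs([]interface{}{1.9})
	fmt.Println(err)
}
```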

Comment on lines +6828 to +6830
// Empty session_id should cause the INSERT to fail (NOT NULL constraint)
_, err = s.Consolidate([]int64{id1}, AddObservationParams{
	SessionID: "", // this should cause failure

Copilot AI Apr 26, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This test comment says the INSERT should fail due to a NOT NULL constraint, but an empty string is still non-NULL in SQLite. The failure here is more likely due to the observations.session_id foreign key referencing sessions(id) (or another constraint), so the comment should be updated to reflect the actual reason the insert fails to keep the test self-explanatory.

Suggested change
- // Empty session_id should cause the INSERT to fail (NOT NULL constraint)
- _, err = s.Consolidate([]int64{id1}, AddObservationParams{
- 	SessionID: "", // this should cause failure
+ // Empty session_id is invalid and should cause the INSERT to fail.
+ _, err = s.Consolidate([]int64{id1}, AddObservationParams{
+ 	SessionID: "", // invalid session_id; this should cause failure



Development

Successfully merging this pull request may close these issues.

feat(store): add atomic mem_consolidate tool for memory garbage collection

2 participants