
fix(ai): buffer streaming preamble text before code fence appears#8911

Open
temrjan wants to merge 1 commit into marimo-team:main from temrjan:fix/ai-cell-preamble-text

Conversation

@temrjan temrjan commented Mar 28, 2026

Summary

  • Buffer incoming AI stream chunks in CellCreationStream until a code fence (```) appears, preventing conversational preamble from becoming a separate Python cell
  • Flush buffered content as a cell in stop() if the stream ends without any fence (backward compatibility for models that return code without fences)
  • Add tests for: preamble + fence, code without fence, fence from first chunk

Root cause

`CellCreationStream.stream()` called `codeToCells(buffer)` on every chunk. Before any fence arrived, `codeToCells` treated the entire buffer as Python code (line 166 of `completion-utils.ts`: ````if (!code.includes("```")) return [{ language: "python", code }]````), creating a cell from preamble text like "I'll create a fibonacci function...".
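To make the failure mode concrete, here is a paraphrased sketch of that early return. This is not the real `codeToCells` from `completion-utils.ts`; the fence-extraction branch is heavily simplified for illustration.

```typescript
// Paraphrased sketch of the buggy code path (not the real codeToCells).
interface Cell {
  language: string;
  code: string;
}

function codeToCellsSketch(code: string): Cell[] {
  // No fence yet: the whole buffer is treated as Python code. Called on a
  // partial stream, this turns preamble prose into a "Python" cell.
  if (!code.includes("```")) {
    return [{ language: "python", code }];
  }
  // Fence present: extract only the fenced body (heavily simplified).
  const match = /```(\w*)\n([\s\S]*?)```/.exec(code);
  return match
    ? [{ language: match[1] || "python", code: match[2].trimEnd() }]
    : [];
}
```

Calling this on each partial chunk is only safe once the buffer is known to contain a fence, which is exactly what the buffering fix enforces.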

What this does NOT fix

The "Inline AI edit" flow (problem #2 in the issue) uses a different code path through the backend `without_wrapping_backticks` function. That is a separate issue.

Test plan

  • All 25 staged-cells tests pass
  • TypeScript clean (no new errors)
  • Test: preamble text buffered, cell created only when fence arrives
  • Test: code without fence → cell created on stream end (backward compat)
  • Test: fence in first chunk → cell created immediately (no delay)
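The three streaming scenarios above can be walked through with a small simulation. `simulateStream` is a hypothetical stand-in for driving the stream, not the actual test code; it only counts how many times a cell would be created or updated.

```typescript
// Hypothetical end-to-end walkthrough of the three test-plan scenarios.
function simulateStream(chunks: string[]): { emits: number } {
  let buffer = "";
  let fenceSeen = false;
  let emits = 0;
  for (const chunk of chunks) {
    buffer += chunk;
    if (!fenceSeen && buffer.includes("```")) fenceSeen = true;
    if (fenceSeen) emits++; // cell created/updated only after the fence
  }
  if (!fenceSeen && buffer) emits++; // stop(): flush fenceless code
  return { emits };
}

// Scenario 1: preamble + fence -> no emit until the fence chunk arrives.
// Scenario 2: code without fence -> single emit at stream end.
// Scenario 3: fence in first chunk -> emit starts immediately, no delay.
```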

Fixes #8880

🤖 Generated with Claude Code

When AI models emit conversational preamble before code fences
(e.g. "I'll create a cell that..."), the preamble was incorrectly
created as a separate Python cell. This happened because
CellCreationStream called codeToCells on partial buffer before
any fence arrived, treating plain text as Python code.

Now CellCreationStream buffers incoming chunks until a code fence
(```) appears. Once a fence is found, codeToCells correctly extracts
only the code. If the stream ends without any fence, the buffer is
flushed as a cell on stop() for backward compatibility.

Note: This fixes the "Generate AI cell" flow. The "Inline AI edit"
flow (backend without_wrapping_backticks) is a separate issue.

Fixes marimo-team#8880

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

vercel bot commented Mar 28, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
marimo-docs Ready Ready Preview, Comment Mar 28, 2026 7:06pm



github-actions bot commented Mar 28, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


temrjan commented Mar 28, 2026

I have read the CLA Document and I hereby sign the CLA



Development

Successfully merging this pull request may close these issues.

AI cell generation emits conversational text as Python code
