
Fix: show processing indicator during pre-token latency (turnInFlight) #190

Closed

sakshyasinha wants to merge 3 commits into cuga-project:main from sakshyasinha:fix/turn-inflight-indicator

Conversation

sakshyasinha commented May 4, 2026

Show a processing indicator as soon as the user submits a message (before first tokens arrive) by introducing a turnInFlight state in the frontend stream manager.

Changes:

  • Added turnInFlight and getIsProcessing() to StreamManager.tsx.
  • Stop button uses the combined processing state.
  • Set/clear turnInFlight in customSendMessage.ts and StreamingWorkflow.ts.

This reduces pre-token UI blankness and prevents users from retrying while the agent is actually working.
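The idea can be sketched as a minimal stand-alone model. The names below (turnInFlight, getIsProcessing, subscribe) mirror this PR's description, but the actual StreamManager.tsx implementation may differ:

```typescript
// Hypothetical sketch of the combined-state idea described in this PR.
// Subscribers see a single boolean: turnInFlight || isStreaming.
type Listener = (isProcessing: boolean) => void;

class StreamStateManager {
  private turnInFlight = false;
  private isStreaming = false;
  private listeners: Listener[] = [];

  // Combined state: true while a turn is pending OR tokens are streaming.
  getIsProcessing(): boolean {
    return this.turnInFlight || this.isStreaming;
  }

  setTurnInFlight(value: boolean): void {
    this.turnInFlight = value;
    this.notify();
  }

  setStreaming(value: boolean): void {
    this.isStreaming = value;
    this.notify();
  }

  // Returns an unsubscribe function, a common listener-registry pattern.
  subscribe(listener: Listener): () => void {
    this.listeners.push(listener);
    return () => {
      this.listeners = this.listeners.filter((l) => l !== listener);
    };
  }

  private notify(): void {
    const processing = this.getIsProcessing();
    for (const l of this.listeners) l(processing);
  }
}

// Usage: the indicator turns on at submit time, before any token arrives,
// and stays on across the handoff from turn-in-flight to streaming.
const manager = new StreamStateManager();
const states: boolean[] = [];
manager.subscribe((p) => states.push(p));
manager.setTurnInFlight(true);  // user submitted; no tokens yet
manager.setStreaming(true);     // first token arrived
manager.setTurnInFlight(false); // turn handed off to the stream
manager.setStreaming(false);    // stream finished
console.log(states.join(","));  // true,true,true,false
```

Because `getIsProcessing()` ORs the two flags, the indicator never flickers off in the gap between submit and first token.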

Summary by CodeRabbit

  • Bug Fixes

    • Consolidated processing state so the app reliably shows when any turn or stream is active, preventing stale status.
    • Stop button now appears and responds correctly for all in-progress operations, not just active streaming.
  • UX

    • Background-driven streaming reliably signals completion so UI elements update promptly when work finishes.
  • Documentation

    • Added a demo recording guide showing how to capture the new turn-in-flight behavior.
  • Tests

    • Added a Playwright script to record demo videos for QA.

Add turnInFlight state to StreamManager, use combined processing state in StopButton, and set/clear turnInFlight around message send and background streaming to show UI during pre-token latency.
coderabbitai Bot commented May 4, 2026

📝 Walkthrough

A combined processing state (turnInFlight || isStreaming) was added. StreamStateManager gained turn-in-flight tracking and getIsProcessing(); subscribers now receive the combined boolean. StreamingWorkflow and send-message paths set/clear the turn-in-flight flag and await background completion. StopButton now uses the combined processing state.

Changes

Processing State and Turn-In-Flight Tracking

Layer / File(s): Summary

  • State Management Infrastructure (src/frontend_workspaces/agentic_chat/src/StreamManager.tsx): StreamStateListener now receives a combined isProcessing boolean (turnInFlight || isStreaming); StreamStateManager gains turn-in-flight tracking and getIsProcessing().
  • Core Background Flow (src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts): streamViaBackground sets setTurnInFlight(true) early, establishes a completionPromise/resolver, resolves it on agent_complete/agent_error and on dispatch errors, removes the message listener, awaits completion, and clears the turn-in-flight flag in finally.
  • Message Sending Integration (src/frontend_workspaces/agentic_chat/src/customSendMessage.ts): customStreamMessage and customSendMessage import streamStateManager and wrap streaming calls in try/finally, calling setTurnInFlight(true) before and setTurnInFlight(false) in finally.
  • UI Wiring (src/frontend_workspaces/agentic_chat/src/floating/stop_button.tsx): StopButton initializes from streamStateManager.getIsProcessing(), subscribes via streamStateManager.subscribe(...), and conditionally renders based on isProcessing.
  • Demo and Tooling (docs/DEMO_RECORDING.md, scripts/playwright/record_demo.spec.js): added a demo recording doc and a Playwright script capturing the turn-in-flight / streaming UI behavior.
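The try/finally wrapping described for the send path can be sketched as follows. The helper name `sendWithTurnInFlight` and the minimal manager interface are illustrative assumptions, not names from the repository:

```typescript
// Illustrative sketch: the send path owns the turn-in-flight flag via
// try/finally, so it is cleared on success, error, or abort alike.
interface TurnFlagOwner {
  setTurnInFlight(value: boolean): void;
}

async function sendWithTurnInFlight(
  manager: TurnFlagOwner,
  stream: () => Promise<void>
): Promise<void> {
  // Indicator turns on before any network I/O happens.
  manager.setTurnInFlight(true);
  try {
    await stream();
  } finally {
    // Cleared even when the streaming call rejects, so the UI never sticks.
    manager.setTurnInFlight(false);
  }
}

// Usage: even when the stream fails, the flag sequence is true then false.
const calls: boolean[] = [];
const fakeManager: TurnFlagOwner = { setTurnInFlight: (v) => calls.push(v) };
sendWithTurnInFlight(fakeManager, async () => {
  throw new Error("network failure");
}).catch(() => {
  console.log(calls.join(",")); // true,false
});
```

This is also why the review below flags double ownership: if both the send path and streamViaBackground wrap the same work this way, the flag toggles redundantly.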

Sequence Diagram

sequenceDiagram
    participant User
    participant UI as StopButton
    participant Send as customSendMessage / customStreamMessage
    participant Manager as StreamStateManager
    participant Workflow as StreamingWorkflow
    participant BG as Background/Runtime

    User->>Send: send message
    Send->>Manager: setTurnInFlight(true)
    Manager->>UI: notify(isProcessing=true)
    UI->>UI: render stop button

    Send->>Workflow: streamViaBackground() / fetchStreamingData()
    Workflow->>Manager: setTurnInFlight(true)
    Workflow->>BG: dispatch background message

    BG->>BG: process agent request
    BG->>Workflow: agent_complete / agent_error
    Workflow->>Workflow: resolve completionPromise
    Workflow->>Manager: setTurnInFlight(false)
    Manager->>UI: notify(isProcessing=false)
    UI->>UI: hide stop button

    Send->>Manager: setTurnInFlight(false) (finally)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes


Poem

🐰 I hop on code and hold my breath,
Turn-in-flight and stream, now one light,
Buttons bloom when work begins, then rest,
A tidy hush when processes are right.

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning: docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (4 passed)

  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: the title accurately describes the main change: adding a processing indicator that shows activity during pre-token latency using the turnInFlight flag.
  • Linked Issues Check ✅ Passed: check skipped because no linked issues were found for this pull request.
  • Out of Scope Changes Check ✅ Passed: check skipped because no linked issues were found for this pull request.



coderabbitai bot left a comment (Contributor)


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/frontend_workspaces/agentic_chat/src/StreamManager.tsx (1)

59-75: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Clear local processing state before awaiting the /stop request.

On Line 60 through Line 75, stopStream() waits for apiFetch('/stop') before clearing local flags. If that request is slow/fails, the UI can remain stuck in “processing” even after the user pressed Stop.

Proposed fix
  async stopStream() {
    if (this.currentAbortController) {
      this.currentAbortController.abort();
    }

+   // Clear local UI state immediately.
+   this.setStreaming(false);
+   this.setTurnInFlight(false);
+   this.setAbortController(null);
+
    try {
      const response = await apiFetch('/stop', {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
      });

      if (!response.ok) {
        console.error("Failed to stop stream on server");
      }
    } catch (error) {
      console.error("Error stopping stream:", error);
    }
-
-   this.setStreaming(false);
-   this.setTurnInFlight(false);
  }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/frontend_workspaces/agentic_chat/src/StreamManager.tsx` around lines 59 -
75, In stopStream(), clear the local UI processing state before waiting for the
server call by calling this.setStreaming(false) and this.setTurnInFlight(false)
prior to invoking or awaiting apiFetch('/stop'); keep the try/catch around the
apiFetch('/stop') call (so errors are still logged) but ensure the UI flags are
cleared first (or fire-and-forget the apiFetch) so the UI doesn't remain stuck
if the request is slow or fails. Use the existing methods this.setStreaming,
this.setTurnInFlight, and apiFetch('/stop') to implement this change.
🧹 Nitpick comments (1)
src/frontend_workspaces/agentic_chat/src/customSendMessage.ts (1)

139-148: ⚡ Quick win

Use a single owner for turnInFlight on the background path.

customStreamMessage toggles turnInFlight, but streamViaBackground now does the same. Keeping both creates redundant state transitions and makes lifecycle ownership harder to reason about.

Proposed simplification
-    try {
-      streamStateManager.setTurnInFlight(true);
-      switch (request.input.text) {
-        default:
-          await streamViaBackground(instance, request.input.text || "");
-          break;
-      }
-    } finally {
-      streamStateManager.setTurnInFlight(false);
-    }
+    switch (request.input.text) {
+      default:
+        await streamViaBackground(instance, request.input.text || "");
+        break;
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/frontend_workspaces/agentic_chat/src/customSendMessage.ts` around lines
139 - 148, The code currently sets
streamStateManager.setTurnInFlight(true/false) around every send but
streamViaBackground already manages turnInFlight, causing duplicate toggles;
adjust ownership so only one place controls it. Remove the outer try/finally
calls to streamStateManager.setTurnInFlight in customSendMessage.ts around the
switch and either (a) let streamViaBackground fully manage turnInFlight, or (b)
only set turnInFlight in customStreamMessage for non-background cases; ensure
streamViaBackground continues to call streamStateManager.setTurnInFlight(true)
at start and setTurnInFlight(false) when finished so there’s a single owner of
the turnInFlight lifecycle.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts`:
- Around line 588-592: The completionPromise (with completionResolver) can
remain unresolved when sendMessage returns { type: "error" } or rejects, leaving
the listener attached and processing stuck; update the sendMessage call paths in
StreamingWorkflow (the branches handling sendMessage's success/error and its
.catch) to always call completionResolver() and remove the message listener
before returning or throwing, and for safety wrap the background work in a
try/finally (or ensure every early-return/error branch) so completionResolver is
invoked and the listener detached in all outcomes (references:
completionPromise, completionResolver, sendMessage, and the message listener
registration).

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: e3542f72-031f-4102-abfd-bc26fa52d8f9

📥 Commits

Reviewing files that changed from the base of the PR and between ce02fb4 and ab778ad.

📒 Files selected for processing (4)
  • src/frontend_workspaces/agentic_chat/src/StreamManager.tsx
  • src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts
  • src/frontend_workspaces/agentic_chat/src/customSendMessage.ts
  • src/frontend_workspaces/agentic_chat/src/floating/stop_button.tsx

Comment thread src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts
coderabbitai bot left a comment (Contributor)


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts (1)

700-719: ⚠️ Potential issue | 🔴 Critical | ⚡ Quick win

Remove the extra } in the .catch() block.

Line 715 closes the callback one time too many, so this file no longer parses and the await completionPromise / finally block below it becomes invalid.

Suggested fix
     } catch (e) {
       // noop
     }
     (window as any).chrome.runtime.onMessage.removeListener(listener);
-      }
     });
   // Wait until the background signals completion via the listener
   await completionPromise;
#!/bin/bash
set -e
sed -n '700,719p' src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts | nl -ba
if command -v biome >/dev/null 2>&1; then
  biome check src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts
fi
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts` around lines
700 - 719, The .catch callback for the dispatch promise has an extra closing
brace that prematurely ends the function and breaks the subsequent await
completionPromise / finally block; open the .catch handler around the error
handling that references completionResolver, window.aiSystemInterface and
listener (the catch that logs "Failed to dispatch agent_query" and calls
window.aiSystemInterface.addStep / setProcessingComplete,
completionResolver?.(), and removes the runtime message listener) and remove the
stray closing '}' so the .catch block properly closes only once and the
surrounding async flow (await completionPromise and the finally block) parses
correctly.
♻️ Duplicate comments (1)
src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts (1)

669-721: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Keep a final listener teardown in the outer finally.

Line 669 registers a global runtime listener, but Lines 719-721 do not unregister it. If anything throws after registration and before one of the async completion branches runs, that listener leaks into the next turn.

Suggested fix
   await completionPromise;
   } finally {
+    (window as any).chrome.runtime.onMessage.removeListener(listener);
     // Ensure we clear the in-flight flag when this function completes.
     streamStateManager.setTurnInFlight(false);
   }
#!/bin/bash
python - <<'PY'
from pathlib import Path

path = Path("src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts")
lines = path.read_text().splitlines()

for n in [669, 642, 659, 697, 714, 719, 720, 721]:
    print(f"{n}: {lines[n-1]}")
PY
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts` around lines
669 - 721, The runtime message listener registered as (window as
any).chrome.runtime.onMessage.addListener(listener) can leak if an exception
occurs before the branch that removes it; update the outer finally block that
runs after awaiting completionPromise (the finally that currently calls
streamStateManager.setTurnInFlight(false)) to also call (window as
any).chrome.runtime.onMessage.removeListener(listener) so the listener is always
torn down; ensure you reference the same listener variable used in sendMessage's
.then/.catch branches and keep the removal idempotent/safe if the listener was
already removed.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 8e99b96c-be7b-4ee2-b02c-46aa51cc65cb

📥 Commits

Reviewing files that changed from the base of the PR and between ab778ad and 9b93bff.

📒 Files selected for processing (1)
  • src/frontend_workspaces/agentic_chat/src/StreamingWorkflow.ts

@sami-marreed (Contributor) commented:
Can you record a demo showcasing this?

coderabbitai bot left a comment (Contributor)


Actionable comments posted: 1

🧹 Nitpick comments (1)
scripts/playwright/record_demo.spec.js (1)

18-21: ⚡ Quick win

Replace fixed timing waits with state-based waits for a stable demo.

networkidle + fixed sleeps (1s, 12s) can make recordings flaky across machines. Waiting for concrete UI states (input ready, processing indicator shown, first response token shown) will make the demo deterministic and aligned with the PR goal.

Also applies to: 35-36

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@scripts/playwright/record_demo.spec.js` around lines 18 - 21, Replace the
fixed sleeps after page.goto and later waits (uses of page.waitForTimeout) with
state-based waits: use page.waitForSelector or page.waitForFunction to wait for
the input to be ready (e.g., await
page.waitForSelector('input[your-input-selector]:not([disabled])')), wait for
the processing indicator to appear/disappear (e.g., await
page.waitForSelector('.processing') / await page.waitForSelector('.processing',
{ state: 'hidden' })), and wait for the first response token or message to
render (e.g., await page.waitForSelector('.response .token:first-child')).
Replace the 1s/12s waits around page.goto and later in the script with these
selector/function-based waits so the demo is deterministic and stable across
machines.
ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 8a79bfb5-ff79-437e-9a03-3d39713765cd

📥 Commits

Reviewing files that changed from the base of the PR and between 9b93bff and b6bb6a5.

📒 Files selected for processing (2)
  • docs/DEMO_RECORDING.md
  • scripts/playwright/record_demo.spec.js
✅ Files skipped from review due to trivial changes (1)
  • docs/DEMO_RECORDING.md

Comment on lines +11 to +43
const browser = await chromium.launch({ headless: true });
const context = await browser.newContext({
recordVideo: { dir: outDir, size: { width: 1280, height: 720 } },
});
const page = await context.newPage();

console.log('Opening', url);
await page.goto(url, { waitUntil: 'networkidle' });

// Short waits to let the app bootstrap. Adjust selectors if your app differs.
await page.waitForTimeout(1000);

// If your app has a chat input with a known selector, update this.
const inputSelector = 'textarea, input[type=text], [contenteditable="true"]';
await page.waitForSelector(inputSelector, { timeout: 5000 });

// Focus the input and type a query that triggers the fake stream
await page.click(inputSelector);
await page.keyboard.type('Show processing indicator demo');

// Submit — try Enter key; if your UI needs a button, change selector accordingly
await page.keyboard.press('Enter');

// Wait long enough to capture pre-token latency + streaming
console.log('Recording demo — waiting 12s to capture pre-token phase and streaming');
await page.waitForTimeout(12000);

// Stop and save video
const video = await page.video().path();
console.log('Video saved to:', video);

await browser.close();
console.log('Done. Files in', outDir);
coderabbitai bot commented (Contributor):

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Ensure browser/context are always closed, even on failures.

If any step throws (navigation, selector, typing), cleanup is skipped and the recording may not finalize reliably. Wrap the flow in try/finally and close context/browser in finally.

Suggested fix
 (async () => {
   const url = process.argv[2] || process.env.DEMO_URL || 'http://localhost:3000';
   const outDir = 'demo-recordings';
   if (!fs.existsSync(outDir)) fs.mkdirSync(outDir, { recursive: true });

   const browser = await chromium.launch({ headless: true });
-  const context = await browser.newContext({
-    recordVideo: { dir: outDir, size: { width: 1280, height: 720 } },
-  });
-  const page = await context.newPage();
-
-  console.log('Opening', url);
-  await page.goto(url, { waitUntil: 'networkidle' });
+  const context = await browser.newContext({
+    recordVideo: { dir: outDir, size: { width: 1280, height: 720 } },
+  });
+  try {
+    const page = await context.newPage();
+    console.log('Opening', url);
+    await page.goto(url, { waitUntil: 'networkidle' });

-  // Short waits to let the app bootstrap. Adjust selectors if your app differs.
-  await page.waitForTimeout(1000);
+    // Short waits to let the app bootstrap. Adjust selectors if your app differs.
+    await page.waitForTimeout(1000);

-  // If your app has a chat input with a known selector, update this.
-  const inputSelector = 'textarea, input[type=text], [contenteditable="true"]';
-  await page.waitForSelector(inputSelector, { timeout: 5000 });
+    // If your app has a chat input with a known selector, update this.
+    const inputSelector = 'textarea, input[type=text], [contenteditable="true"]';
+    await page.waitForSelector(inputSelector, { timeout: 5000 });

-  // Focus the input and type a query that triggers the fake stream
-  await page.click(inputSelector);
-  await page.keyboard.type('Show processing indicator demo');
+    // Focus the input and type a query that triggers the fake stream
+    await page.click(inputSelector);
+    await page.keyboard.type('Show processing indicator demo');

-  // Submit — try Enter key; if your UI needs a button, change selector accordingly
-  await page.keyboard.press('Enter');
+    // Submit — try Enter key; if your UI needs a button, change selector accordingly
+    await page.keyboard.press('Enter');

-  // Wait long enough to capture pre-token latency + streaming
-  console.log('Recording demo — waiting 12s to capture pre-token phase and streaming');
-  await page.waitForTimeout(12000);
+    // Wait long enough to capture pre-token latency + streaming
+    console.log('Recording demo — waiting 12s to capture pre-token phase and streaming');
+    await page.waitForTimeout(12000);

-  // Stop and save video
-  const video = await page.video().path();
-  console.log('Video saved to:', video);
-
-  await browser.close();
-  console.log('Done. Files in', outDir);
+    // Stop and save video
+    const video = await page.video().path();
+    console.log('Video saved to:', video);
+    console.log('Done. Files in', outDir);
+  } finally {
+    await context.close();
+    await browser.close();
+  }
 })();
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@scripts/playwright/record_demo.spec.js` around lines 11 - 43, Wrap the main
flow (creating context/page, navigation, typing, waiting) in a try/finally so
cleanup always runs: put the existing steps that use browser, context, page and
inputSelector inside try, and in finally check for and await context.close() and
browser.close() (guarding for undefined/null) to ensure the video is finalized;
move or obtain the video path at a point that is safe (after closing/finalizing
the context if needed) and handle errors from page.video().path() gracefully in
the finally block.

@sami-marreed (Contributor) commented:
Can you record a demo showcasing this?

I meant showing the use case, not the code.

@sami-marreed (Contributor) commented:
If there's no response, I'll close this in 1 day.

@sami-marreed (Contributor) commented:
Closing this pull request to keep the queue tidy: there hasn’t been any follow-up activity here for a while. If you still want these changes merged, feel free to reopen or open a fresh PR against current main and we can pick it up from there.
