
fix: AI chat scroll, suggestion parsing, and dialog stability#101

Merged
TerrifiedBug merged 6 commits into main from fix/ai-chat-bugs on Mar 11, 2026

Conversation

@TerrifiedBug
Owner

Summary

  • Pipeline review tab scroll broken: Replaced Radix ScrollArea with native overflow-y-auto div — ScrollArea's internal viewport wasn't constraining height in flex layouts, causing content to overflow onto the input area
  • AI suggestion cards not rendering: Both VRL and pipeline JSON response parsers now strip markdown code fences before parsing — LLMs sometimes wrap JSON in ```json fences despite prompt instructions
  • VRL editor closing on tab switch: Added onInteractOutside prevention to keep the dialog open when switching browser tabs
  • Sample dropdown size mismatch: Use the SelectTrigger size="sm" prop instead of a className override — the component's own data-[size] attribute selector was winning the specificity contest
  • Server-side parsing: Both API routes now reuse the same parseVrlChatResponse/parseAiReviewResponse functions instead of inline JSON.parse
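
The fence-stripping step described above can be sketched as follows. The helper name `stripCodeFences` comes from this PR's diff; the exact regex shown here is an assumption, not the merged implementation:

```typescript
// Sketch of a code-fence stripper: unwraps a ```json … ``` (or plain
// ``` … ```) wrapper around an LLM response before JSON.parse.
// The regex details are assumed for illustration.
function stripCodeFences(text: string): string {
  const trimmed = text.trim();
  // Opening fence with optional "json" tag, lazy inner capture, closing fence.
  const match = trimmed.match(/^```(?:json)?\s*\n?([\s\S]*?)\n?```$/);
  return match ? match[1].trim() : trimmed;
}
```

Unfenced input passes through unchanged, so the helper is safe to apply unconditionally before parsing.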

Test plan

  • Open AI Pipeline Builder → Review tab, send messages — scrolling should work and the "New Conversation" button should remain visible
  • VRL AI chat: verify suggestion cards render (not raw JSON) when AI responds
  • VRL editor: switch browser tabs and return — dialog should stay open
  • VRL editor toolbar: sample dropdown should match button height

Two bugs fixed:
- VRL chat route was reusing existing conversations when conversationId
  was null (New Chat), causing old context to persist. Now always creates
  a new conversation, matching the pipeline route behavior.
- Review tab ScrollArea lacked height constraint in flex layout, causing
  content to overflow and hiding the New Conversation button. Added h-0
  to establish proper scroll containment.
LLMs sometimes wrap JSON responses in ```json fences despite instructions
not to. Both VRL and pipeline parsers now strip code fences before parsing,
and server-side routes reuse the same parsing functions.
…tability

- Replace Radix ScrollArea with native overflow-y-auto div in pipeline
  review tab — ScrollArea wasn't constraining height in flex layout
- Use SelectTrigger size="sm" prop instead of className h-8 (component's
  data-[size=default]:h-9 was overriding the class)
- Use onInteractOutside on VRL dialog to prevent closing on tab switch
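
The dialog change above follows the usual Radix pattern. A minimal sketch — the imports and component shape are illustrative; only the `onInteractOutside` handler reflects the diff:

```tsx
import * as React from "react";
import * as Dialog from "@radix-ui/react-dialog";

// Keep the dialog open when an "outside interaction" fires, e.g. when
// focus shifts during a browser tab switch.
export function VrlEditorDialog({ children }: { children: React.ReactNode }) {
  return (
    <Dialog.Root>
      <Dialog.Portal>
        <Dialog.Content onInteractOutside={(e) => e.preventDefault()}>
          {children}
        </Dialog.Content>
      </Dialog.Portal>
    </Dialog.Root>
  );
}
```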
@greptile-apps
Contributor

greptile-apps bot commented Mar 11, 2026

Greptile Summary

This PR fixes four independent UI/UX regressions in the AI chat and VRL editor features: scroll layout in the pipeline review tab, suggestion card rendering when LLMs wrap JSON in markdown code fences, VRL dialog stability on browser tab switches, and a sample dropdown height mismatch.

Key changes:

  • Scroll fix: ScrollArea replaced with a native div using min-h-0 overflow-y-auto — the standard fix for flex children that refuse to shrink below their content height.
  • Code-fence stripping: Both parseAiReviewResponse and parseVrlChatResponse now call an internal stripCodeFences helper before JSON.parse, correctly handling ```json and plain ``` wrappers. The identical helper is implemented separately in each file; extracting it into a shared utility (e.g. src/lib/ai/utils.ts) would prevent the two implementations from diverging.
  • VRL dialog stability: onInteractOutside={(e) => e.preventDefault()} is the correct Radix UI pattern to prevent the dialog from dismissing on browser tab focus changes.
  • Conversation lifecycle change: The VRL chat route now always creates a new conversation when no conversationId is supplied, delegating conversation selection to the client via tRPC. This is an intentional behavioral change documented in the PR description.
  • Both API routes have a now-stale catch comment ("Not valid JSON — store as raw text") — since the parser functions return null instead of throwing, the catch only fires on dynamic import failure, not on malformed JSON.

Confidence Score: 4/5

  • Safe to merge — all changes fix real regressions with correct approaches and no functional bugs introduced.
  • All five fixes are sound: the flex/scroll fix uses the canonical min-h-0 pattern, code-fence stripping is regex-correct for the LLM output patterns described, onInteractOutside prevention is the standard Radix UI approach, and the size="sm" prop correctly resolves the CSS specificity conflict. The only findings are two stale catch comments (style) and duplicated stripCodeFences logic across two files (maintenance). No correctness bugs, security issues, or data-loss risks.
  • No files require special attention.

Important Files Changed

Filename Overview
src/lib/ai/suggestion-validator.ts Adds stripCodeFences helper and threads it into parseAiReviewResponse. Logic is correct; the regex handles both plain ``` and ```json fences. Function is identical to the one added in vrl-suggestion-types.ts — a shared utility would avoid duplication, but this is a style concern, not a bug.
src/lib/ai/vrl-suggestion-types.ts Adds an identical stripCodeFences helper and uses it in parseVrlChatResponse. Parsing logic is correct. The duplication with suggestion-validator.ts is a minor maintenance concern.
src/app/api/ai/vrl-chat/route.ts Switches from reusing an existing conversation to always creating a new one when no conversationId is supplied — intentional per PR description, as the client now owns conversation selection. Also migrates to parseVrlChatResponse for suggestion parsing. The outer catch comment is now misleading since the inner function no longer throws.
src/app/api/ai/pipeline/route.ts Migrates to parseAiReviewResponse for suggestion parsing; logic is equivalent to before. Same stale catch comment issue as in the VRL chat route.
src/components/flow/ai-pipeline-dialog.tsx Replaces ScrollArea with a native div using min-h-0 overflow-y-auto — a well-known fix for flex children that wouldn't shrink below content height. The min-h-0 override is the key correction.
src/components/vrl-editor/vrl-editor.tsx Adds onInteractOutside prevention to keep the dialog open on browser tab switches, and fixes SelectTrigger to use the size="sm" prop instead of a className height override. Both are correct Radix UI patterns.

Flowchart

%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[LLM streams response tokens] --> B[fullResponse assembled]
    B --> C{Strip code fences\nstripCodeFences}
    C -->|Plain JSON| D[JSON.parse]
    C -->|```json fence detected| E[Extract inner JSON] --> D
    D -->|valid shape| F[parsedSuggestions = suggestions array]
    D -->|invalid / null| G[parsedSuggestions = null]
    F --> H[Persist AiMessage with suggestions JSON]
    G --> H
    H --> I[Stream done event to client]

Comments Outside Diff (2)

  1. src/app/api/ai/vrl-chat/route.ts, lines 182-184

    Stale catch comment — inner function no longer throws

    parseVrlChatResponse now returns null on invalid JSON instead of throwing, so the outer catch block will only ever fire if the dynamic import() itself fails (which is not a JSON parsing failure). The comment // Not valid JSON — store as raw text is no longer accurate and could mislead future maintainers into thinking JSON parse errors are being handled here.

  2. src/app/api/ai/pipeline/route.ts, lines 192-194

    Stale catch comment — same issue as in vrl-chat route

    parseAiReviewResponse returns null rather than throwing on invalid input, so the catch here only fires on a dynamic import failure. The comment // Not valid JSON — store as raw text no longer reflects what is actually being caught.

Last reviewed commit: 6fa39bb

@TerrifiedBug TerrifiedBug merged commit 785ef54 into main Mar 11, 2026
3 checks passed
@TerrifiedBug TerrifiedBug deleted the fix/ai-chat-bugs branch March 11, 2026 18:27