fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected #375

Open

crowwdev wants to merge 5 commits into chenyme:main from crowwdev:fix/video-llm-prompts-banned-token-removal

Conversation


@crowwdev crowwdev commented Mar 21, 2026

PR Title

fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected


Summary

Two independent fixes:

  1. Video extension scene repetition — Each extension round now gets a unique LLM-generated continuation prompt instead of reusing the original prompt, preventing repeated scenes in long videos.

  2. Auto-remove banned tokens — When a token refresh returns 400 with email-domain-rejected in the response body, the token is immediately removed from the pool.


Changes

app/services/grok/services/video.py

  • Added _generate_continuation_prompt() using curl_cffi.requests.AsyncSession (no new dependencies)
  • Reads app.api_key from config and sets Authorization header if configured
  • Sets "stream": False explicitly in the LLM request
  • Uses grok-3-fast model with temperature=0.8, max_tokens=60
  • Falls back to original prompt silently if LLM call fails
  • _stream_chain() and _collect_chain() call the LLM once per extension round (when plan.is_extension is True)

app/services/grok/services/video_extend.py

  • Imports _generate_continuation_prompt from video.py

app/services/token/manager.py

  • In record_fail(): when status_code == 400 and reason contains email-domain-rejected, token is immediately removed from pool
  • In _refresh_one(): same check during cooling token refresh — 400 + email-domain-rejected removes token from pool instantly
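The banned-token rule in both bullets reduces to the same check. A minimal sketch, assuming a simplified pool (the real manager.py class and method signatures are not shown in this PR, so TokenPool here is illustrative; only the 400 + email-domain-rejected condition is taken from the description):

```python
BANNED_MARKER = "email-domain-rejected"

class TokenPool:
    """Illustrative stand-in for the project's token manager."""

    def __init__(self, tokens):
        self.tokens = set(tokens)

    def record_fail(self, token: str, status_code: int, reason: str) -> None:
        # A 400 whose body mentions the banned-domain marker means the
        # account can never succeed, so drop the token immediately
        # instead of letting it cool down and retry later.
        if status_code == 400 and BANNED_MARKER in (reason or ""):
            self.tokens.discard(token)

pool = TokenPool({"tok_a", "tok_b"})
pool.record_fail("tok_a", 400, '{"error": "email-domain-rejected"}')
pool.record_fail("tok_b", 429, "rate limited")  # unrelated failure: token kept
```

The same predicate runs in the cooling-refresh path (_refresh_one() in the PR), so a banned token is evicted whichever code path first observes the 400 response.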

Verification

  • A 30-second video (5 rounds) generates 5 distinct prompts, visible in the logs
  • Tokens failing with email-domain-rejected disappear from the pool immediately after the next refresh cycle
  • Container starts without errors (no new pip dependencies)
  • Fallback to original prompt works when LLM is unavailable

@crowwdev crowwdev changed the title fix: use LLM for unique video extension prompts; auto-remove email-do… fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected Mar 21, 2026
jiangmuran added a commit to jiangmuran/grok2api_pro that referenced this pull request Mar 21, 2026
…s and auto-remove email-domain-rejected tokens

- Video extension: Each extension round now gets a unique LLM-generated continuation prompt instead of reusing the original prompt, preventing repeated scenes in long videos
- Token management: When a token refresh returns 400 with 'email-domain-rejected', the token is immediately removed from the pool
- Modified files: video.py, video_extend.py, manager.py

Upstream PR: chenyme#375