3 changes: 2 additions & 1 deletion .github/workflows/cr.yml
```diff
@@ -23,4 +23,5 @@ jobs:

   # common
   LANGUAGE: English
-  LOG_LEVEL: debug
+  LOG_LEVEL: debug
+  max_tokens: 10000
```
Adding max_tokens: 10000 seems reasonable, but consider:

  • Cost: Increasing the token limit directly impacts the cost of API calls. Is 10000 tokens justified for the expected use cases, or could this lead to unexpectedly high costs?
  • Context length of the model: Verify that the underlying LLM supports a context window of at least 10000 tokens. If it doesn't, the request will likely be truncated (and potentially cause unexpected behavior) or fail outright. Select a maximum token value that is compatible with the model.
  • Default value/configuration: Is 10000 the right default? It's generally good practice to make this parameter configurable, for example via an environment variable or command-line argument, so users can tailor it to their needs and budget.
  • Error handling: It would be good to validate the setting; for example, check that the value of max_tokens is a positive integer.
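
The configurability and validation points above could be handled together when the value is read. A minimal sketch (the helper name `resolve_max_tokens`, the default, and the context-window figure are illustrative assumptions, not part of the actual workflow or action):

```python
import os

# Illustrative default; the real action may use a different one.
DEFAULT_MAX_TOKENS = 10000


def resolve_max_tokens(env=None, model_context_window=16384):
    """Read max_tokens from the environment, validate it, and cap it
    at the model's context window (hypothetical limit shown here)."""
    env = os.environ if env is None else env
    raw = env.get("max_tokens", str(DEFAULT_MAX_TOKENS))
    try:
        value = int(raw)
    except ValueError:
        raise ValueError(f"max_tokens must be an integer, got {raw!r}")
    if value <= 0:
        raise ValueError(f"max_tokens must be positive, got {value}")
    # Clamp rather than fail, so an oversized setting still works
    # with the model instead of producing a truncated/failed request.
    return min(value, model_context_window)
```

Clamping to the context window is one design choice; rejecting oversized values with an explicit error would surface misconfiguration earlier, at the cost of a failed run.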