
Incorrect handling when provider is openai #7

@aizimuji

Description

  def check_litellm_providers(self, provider: str) -> tuple[bool, int]:
      # Judge whether the provider is supported by LiteLLM, and whether the
      # provider name should be added to the model name
      if provider in ['azure', 'azure_ai', 'anthropic', 'deepseek', 'sagemaker', 'bedrock', 'vertex_ai', 'vertex_ai_beta', 'palm', 'gemini', 'mistral', 'cloudflare', 'huggingface', 'replicate', 'together_ai', 'openrouter', 'baseten', 'nlp_cloud', 'petals', 'ollama', 'perplexity', 'groq', 'anyscale', 'watsonx', 'voyage', 'xinference', 'nvidia_nim']:
          # provider name should be added to the model name
          return True, 1
      elif provider in ['openai', 'cohere', 'ai21', 'deepinfra', 'alpha_alpha']:
          # provider name should not be added to the model name
          return True, 2
      elif provider in ['xai', 'qwen']:
          # call through the OpenAI-compatible API (prefix with "openai/")
          return True, 3
      else:
          return False, 0

In this function, openai should also be handled in the first branch, i.e. return True, 1.

Otherwise, when a bare model name such as gpt-4o is passed in, LiteLLM raises the error "LLM Provider NOT provided".

The only workaround is to pass the model as openai/gpt-4o, and repeating the prefix like that is too cumbersome.
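A minimal sketch of the proposed fix, with `openai` moved into the prefix group. The provider lists are abbreviated and the `build_model_name` helper is illustrative, not the project's actual API:

```python
def check_litellm_providers(provider: str) -> tuple[bool, int]:
    # Group 1: providers whose name must be prefixed onto the model name.
    # 'openai' is moved here so a bare "gpt-4o" becomes "openai/gpt-4o".
    prefix_providers = ['openai', 'azure', 'anthropic', 'deepseek', 'gemini',
                        'groq', 'ollama']  # abbreviated for illustration
    # Group 2: providers whose model names pass through unchanged.
    plain_providers = ['cohere', 'ai21', 'deepinfra']
    # Group 3: providers called through the OpenAI-compatible endpoint.
    openai_compatible = ['xai', 'qwen']

    if provider in prefix_providers:
        return True, 1
    if provider in plain_providers:
        return True, 2
    if provider in openai_compatible:
        return True, 3
    return False, 0


def build_model_name(provider: str, model: str) -> str:
    # Illustrative caller showing how the mode code would be consumed.
    supported, mode = check_litellm_providers(provider)
    if not supported:
        raise ValueError(f"unsupported provider: {provider}")
    if mode == 1:
        return f"{provider}/{model}"   # e.g. "openai/gpt-4o"
    if mode == 3:
        return f"openai/{model}"       # route via OpenAI-compatible API
    return model                       # mode 2: pass through unchanged
```

With this change, `build_model_name('openai', 'gpt-4o')` yields `openai/gpt-4o`, so LiteLLM can resolve the provider and the "LLM Provider NOT provided" error no longer occurs.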
