```python
def check_litellm_providers(self, provider: str) -> tuple[bool, int]:
    # Judge whether the provider is supported by LiteLLM, and whether the
    # provider name should be prepended to the model name.
    if provider in ['azure', 'azure_ai', 'anthropic', 'deepseek', 'sagemaker', 'bedrock', 'vertex_ai', 'vertex_ai_beta', 'palm', 'gemini', 'mistral', 'cloudflare', 'huggingface', 'replicate', 'together_ai', 'openrouter', 'baseten', 'nlp_cloud', 'petals', 'ollama', 'perplexity', 'groq', 'anyscale', 'watsonx', 'voyage', 'xinference', 'nvidia_nim']:
        # provider name should be added to the model name
        return True, 1
    elif provider in ['openai', 'cohere', 'ai21', 'deepinfra', 'aleph_alpha']:
        # provider name should not be added to the model name
        return True, 2
    elif provider in ['xai', 'qwen']:
        # accessed through an OpenAI-compatible API
        return True, 3
    else:
        return False, 0
```
In this function, `openai` should also be handled by the first branch and return `True, 1`. Otherwise, passing a bare model name such as `gpt-4o` raises the error "LLM Provider NOT provided". The only workaround is to pass the model as `openai/gpt-4o`, and repeating the prefix like that is cumbersome.
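A minimal sketch of the proposed change, assuming the tuple return convention above: `openai` moves into the first branch so that bare model names get the `openai/` prefix added for them. The provider lists here are shortened for illustration, not the full lists from the original code.

```python
def check_litellm_providers(provider: str) -> tuple[bool, int]:
    # Branch 1: provider name must be prefixed to the model name.
    # 'openai' is moved here, as the issue proposes (lists abbreviated).
    prefix_providers = {'openai', 'azure', 'anthropic', 'gemini', 'ollama', 'groq'}
    # Branch 2: supported without prefixing the model name.
    plain_providers = {'cohere', 'ai21', 'deepinfra', 'aleph_alpha'}
    # Branch 3: reached through an OpenAI-compatible API.
    openai_compatible = {'xai', 'qwen'}

    if provider in prefix_providers:
        return True, 1
    if provider in plain_providers:
        return True, 2
    if provider in openai_compatible:
        return True, 3
    return False, 0


def build_model_name(provider: str, model: str) -> str:
    # Hypothetical caller showing the effect: with 'openai' in branch 1,
    # a bare 'gpt-4o' becomes 'openai/gpt-4o' automatically.
    supported, mode = check_litellm_providers(provider)
    if not supported:
        raise ValueError(f"LLM Provider NOT provided: {provider}")
    return f"{provider}/{model}" if mode == 1 else model
```

With this change, `build_model_name('openai', 'gpt-4o')` yields `openai/gpt-4o` without the caller having to write the prefix twice.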