Error when running with vllm serve #26
Open
Description
The error message is as follows:
ValueError: Loading moonshotai/Moonlight-16B-A3B-Instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
Continuing with the command `vllm serve "moonshotai/Moonlight-16B-A3B-Instruct" trust_remote_code=True`, the following error appears:
INFO 04-14 13:59:19 [init.py:239] Automatically detected platform cuda.
usage: vllm [-h] [-v] {chat,complete,serve,bench} ...
vllm: error: unrecognized arguments: trust_remote_code=True
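The second error occurs because the vLLM CLI does not accept bare `key=value` arguments; options are passed as flags. `trust_remote_code` is exposed as the `--trust-remote-code` flag on `vllm serve`, so a likely fix is:

```shell
# Pass trust_remote_code as a CLI flag rather than a key=value pair.
# This tells vLLM to execute the model's custom configuration/modeling
# code from the Hugging Face repo, which Moonlight-16B-A3B-Instruct requires.
vllm serve "moonshotai/Moonlight-16B-A3B-Instruct" --trust-remote-code
```

As the first error message notes, only enable this after reviewing the code in that repository, since it runs remote code on your machine.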