vllm cpu docker. Error: TypeError: invalidInputError() missing 1 required positional argument: 'errMsg' #11741
I also tried a request.post request, which failed with the same error. The request.post request code:
The server-side error is the same as above. The client-side error is as follows:
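For reference, a minimal sketch of the kind of request.post call involved, targeting vLLM's OpenAI-compatible chat completions endpoint. The host, port, and model name here are assumptions, not the values from this issue; the request is only prepared, not sent, so its shape can be inspected before the server is up.

```python
import json
import requests

url = "http://localhost:8000/v1/chat/completions"  # assumed service address
payload = {
    "model": "your-served-model-name",  # placeholder, not the issue's model
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 32,
}

# Prepare the POST without sending it; once the vLLM server is running,
# dispatch it with requests.Session().send(prepared).
prepared = requests.Request(
    "POST",
    url,
    headers={"Content-Type": "application/json"},
    data=json.dumps(payload),
).prepare()

print(prepared.method, prepared.url)
```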
I also tried the curl request provided in the documentation directly, and it failed as well. Request code:
Server-side error type:
Device type:
Hi @Starrylun,
BTW, the error message here indicates
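For context, this class of TypeError is raised by Python itself at call time when a function defined with a required positional parameter is invoked without it. The helper below is purely illustrative; it is not IPEX-LLM's actual invalidInputError function.

```python
# Hypothetical stand-in for a validator that requires two positional arguments.
def invalid_input_error(err_code, err_msg):
    raise ValueError(f"[{err_code}] {err_msg}")

try:
    invalid_input_error("E100")  # second positional argument omitted
except TypeError as exc:
    caught = str(exc)

# Python reports the missing parameter by name, matching the pattern in
# the issue title: "missing 1 required positional argument: 'errMsg'".
print(caught)
```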
Hi @Starrylun,
Thanks. After updating to the latest images, it works fine.
Solved. Please close this issue.
Reference article: vLLM Serving with IPEX-LLM on Intel CPU via Docker
Reference link: https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/DockerGuides/vllm_cpu_docker_quickstart.md#vllm-serving-with-ipex-llm-on-intel-cpu-via-docker
Service startup code:
The start-vllm-service.sh file is:
Request code:
vLLM server-side error output:
OpenAI client-side error output: