I did some tests and found that the latest fastapi 0.109 and starlette 0.35.1 cause the crash.
After I reinstalled fastapi 0.104 and starlette 0.27.0, everything works fine.
To reproduce, you need to stop ray and start it over.
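A minimal sketch of that downgrade workaround, assuming the exact patch releases 0.104.0 and 0.27.0 (the thread only gives 0.104/0.27.0, so adjust the pins to whatever pip resolves):

# stop the running Ray cluster first, then pin the known-good versions
ray stop
pip install "fastapi==0.104.0" "starlette==0.27.0"
# restart Ray so the pinned versions are picked up
ray start --head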
After installing the env via pip install .[cpu] -f https://developer.intel.com/ipex-whl-stable-cpu -f https://download.pytorch.org/whl/torch_stable.html, I also encountered this problem. fastapi is installed together with ray[serve]; I've verified that every version in the range 0.104 <= fastapi <= 0.109 works well. The above problem will be fixed by #61.
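To confirm which versions ray[serve] actually pulled into the env, a quick check (nothing beyond the two packages themselves is assumed here):

python -c "import fastapi, starlette; print(fastapi.__version__, starlette.__version__)"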
First create a new conda env with Python 3.9, and install with
pip install -e .[bigdl-cpu] -f https://developer.intel.com/ipex-whl-stable-cpu -f https://download.pytorch.org/whl/torch_stable.html
Then run:
python inference/serve.py --config_file inference/models/bigdl/mpt-7b-bigdl.yaml --serve_simple
It looks like something got messed up after installing bigdl-llm:
After I switched to another env, it works fine.
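If you want the bigdl-cpu env to come up with the known-good versions directly, one option is a pip constraints file; this is only a sketch under the assumption that the 0.104.0/0.27.0 pins discussed above are the right ones (the constraints.txt name is arbitrary):

# constraints.txt -- cap the versions pip may resolve for these dependencies
fastapi==0.104.0
starlette==0.27.0

pip install -e .[bigdl-cpu] -c constraints.txt -f https://developer.intel.com/ipex-whl-stable-cpu -f https://download.pytorch.org/whl/torch_stable.html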
@KepingYan @jiafuzha