Chat streaming is not working with Phi 2 #617
Comments
To use the Phi-2 model, run it with the beam_search_causal_lm, greedy_causal_lm, or multinomial_causal_lm samples.
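As a rough illustration of that non-chat route, a generation call through the openvino_genai Python API might look like the sketch below. The model directory, device, and prompt are placeholders, and the exact sample code lives in the repository's samples directory.

```python
# Minimal sketch approximating the greedy_causal_lm sample (assumptions:
# a converted Phi-2 model in ./phi-2-openvino and the openvino_genai package).
import openvino_genai

pipe = openvino_genai.LLMPipeline("./phi-2-openvino", "CPU")

config = openvino_genai.GenerationConfig()
config.max_new_tokens = 100

# Single-shot generation, no chat session or streaming involved.
print(pipe.generate("The Sun is yellow because", config))
```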
@YuChern-Intel This is not mentioned in the documentation; see https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md.
I have validated running the microsoft/phi-2 model with these demos.
Hi @rupeshs, if you want to run chat with the phi-2 model, you should add a chat_template field to tokenizer_config.json. See phi-3 for an example.
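For illustration only, one way to patch the file is the small script below. The Jinja template string is an assumption modelled on simple role/content chat templates, not the official Phi-2 format, and the model directory is a placeholder; copy the real template from a model such as phi-3 if you need its exact behavior.

```python
# Hypothetical sketch: add a basic chat_template to tokenizer_config.json
# so the chat pipeline can format conversation turns for Phi-2.
import json
from pathlib import Path

config_path = Path("./phi-2-openvino/tokenizer_config.json")  # placeholder path
config = json.loads(config_path.read_text())

# Assumed minimal template: concatenate role/content pairs as plain text.
config["chat_template"] = (
    "{% for message in messages %}"
    "{{ message['role'] }}: {{ message['content'] }}\n"
    "{% endfor %}"
    "assistant:"
)

config_path.write_text(json.dumps(config, indent=2))
```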
#697 - fix segfault
Used sample code: https://github.com/openvinotoolkit/openvino.genai/blob/master/samples/python/chat_sample/chat_sample.py
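For context, the chat streaming path exercised by that sample roughly follows the sketch below. The model directory and prompt are placeholders, and the streamer signature is assumed from the referenced sample: a callback that receives decoded subwords and returns False to continue generation.

```python
# Hedged sketch of the chat streaming flow from chat_sample.py (assumptions:
# openvino_genai package, converted Phi-2 model in ./phi-2-openvino).
import openvino_genai

def streamer(subword: str) -> bool:
    # Print each decoded chunk as it arrives; False means "keep generating".
    print(subword, end="", flush=True)
    return False

pipe = openvino_genai.LLMPipeline("./phi-2-openvino", "CPU")

config = openvino_genai.GenerationConfig()
config.max_new_tokens = 100

pipe.start_chat()
pipe.generate("What is OpenVINO?", config, streamer)
pipe.finish_chat()
```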
Video attachment demonstrating the issue: ov_streaming_issue.mp4