When debugging that method, stream correctly ends up being False, even without --no-stream, when not in chat mode.
But it seems that stream=True is always sent to execute() when in chat mode, and using --no-stream on the command line disables chat mode itself. Here are some test runs with the plugin vs. the OpenAI gpt4 alias; the behavior is similar.
> llm "hi"
Hello! How can I assist you today?
> llm chat
Error: Sorry, this plugin doesn't support streaming yet. # stream was set to True by the host
> llm --no-stream chat
Hello! How can I assist you today? # chat mode not applied but there is a greeting
> llm chat -m gpt4
Chatting with gpt-4
Type 'exit' or 'quit' to exit
> llm --no-stream chat -m gpt4
Hello! How can I assist you today? # chat mode not applied but there is a greeting
> llm chat -m gpt4 --no-stream
Usage: llm chat [OPTIONS]
Try 'llm chat --help' for help.
Error: No such option: --no-stream Did you mean --system?
> llm chat -o verbose # the default model does have this option in non-chat mode
Usage: llm chat [OPTIONS]
Try 'llm chat --help' for help.
Error: No such option: -o
Applying the plugin's own announced streaming boolean would fix this for me. Allowing explicit --no-stream while in chat mode is nice but I don't personally have a strong use case to offer for that, other than perhaps consistency.
Do you plan to support model options in chat mode? They are often needed with local models. The new mode is a very welcome addition, so thanks a lot for working on it.
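The fix suggested above can be sketched on the host side: before dispatching to execute(), combine the user's --no-stream choice with the model's announced streaming capability instead of hard-coding stream=True in chat mode. This is a hypothetical sketch, not llm's actual code; Model, run_prompt, and no_stream are illustrative names.

```python
# Hypothetical host-side sketch: honor the model's announced streaming
# boolean (and any explicit --no-stream flag) instead of hard-coding
# stream=True in chat mode. Names here are illustrative, not llm's API.

class Model:
    can_stream = False  # a plugin that has announced it cannot stream

    def execute(self, prompt, stream):
        if stream and not self.can_stream:
            raise RuntimeError("Sorry, this plugin doesn't support streaming yet.")
        return f"response to {prompt!r}"

def run_prompt(model, prompt, no_stream=False):
    # Stream only when the model supports it AND the user did not opt out.
    stream = model.can_stream and not no_stream
    return model.execute(prompt, stream=stream)
```

With a guard like this, chat mode would pass stream=False to a can_stream = False model rather than triggering the error shown in the transcript above.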
simonw changed the title from "0.10a0: chat mode and --no-stream mode don't work together / plugin can_stream not applied" to "chat mode --no-stream option" on Sep 10, 2023.
The original description opened with: I am writing a plugin, and I've understood that setting the plugin's announced streaming boolean (can_stream) to False will tell llm to always be in --no-stream mode when these models are called. Is this what's needed, and all that is needed? The plugin will only run with stream=False, to manage upstream user expectations.
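A minimal sketch of the plugin side described here, following the general shape of llm's model plugin API (a Model subclass with a can_stream attribute and a generator execute() method). The llm.Model base class is stubbed out so the snippet runs standalone; the model id and messages are illustrative.

```python
# Plugin-side sketch: a model that announces it cannot stream.
# In a real plugin this class would subclass llm.Model and be
# registered via the register_models hook; the base class is
# omitted here so the example is self-contained.

class MyModel:  # stands in for: class MyModel(llm.Model)
    model_id = "my-model"   # illustrative id
    can_stream = False      # announce: streaming not supported

    def execute(self, prompt, stream, response, conversation=None):
        # Guard for the case reported above: the host passes
        # stream=True in chat mode even though can_stream is False.
        if stream:
            raise NotImplementedError(
                "Sorry, this plugin doesn't support streaming yet."
            )
        yield "Hello! How can I assist you today?"
```

If the host consulted can_stream before building the stream argument, this guard would never fire in chat mode.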