agentchat_groupchat.ipynb failed #787
Labels: group chat/teams (group-chat-related issues), models (pertains to using alternate, non-GPT models, e.g., local models, llama, etc.)
Comments
Can you paste the complete error here?
sonichi added the models label on Dec 3, 2023
Could be. @kouyanming, could you share how you set up your LLM endpoint?
Closing this issue due to inactivity. If you have further questions, please open a new issue or join the discussion in the AutoGen Discord server: https://discord.com/invite/Yb5gwGVkE5
Original question: Why do I get an error when I run agentchat_groupchat.ipynb? The error says the OpenAI request parameters are wrong. My endpoint only accepts requests in the following format: {'messages': [{'content': 'you are a mathematician', 'role': 'system'}, {'content': 'who are you', 'role': 'user'}], 'model': 'baichuan-chat'}; sending any additional parameters results in an error.
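For illustration, here is a minimal sketch of how an AutoGen agent might be pointed at an OpenAI-compatible local endpoint serving a model such as baichuan-chat. The base_url, api_key, and agent names below are placeholder assumptions, not values taken from this issue, and older pyautogen releases used "api_base" instead of "base_url".

```python
import autogen

# Placeholder config for an OpenAI-compatible local endpoint.
# The URL and key here are assumptions for illustration only.
config_list = [
    {
        "model": "baichuan-chat",
        "base_url": "http://localhost:8000/v1",
        "api_key": "NotRequired",
    }
]

# Note: AutoGen typically sends more than just "messages" and "model"
# (e.g., temperature when it is set in llm_config), so the endpoint needs
# to accept or ignore those extra OpenAI completion parameters.
assistant = autogen.AssistantAgent(
    name="mathematician",
    system_message="You are a mathematician.",
    llm_config={"config_list": config_list, "temperature": 0},
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# Send a single message through the configured endpoint.
user_proxy.initiate_chat(assistant, message="Who are you?")
```

If a request shaped like this fails against the local server, the extra parameters AutoGen adds are a likely cause, which is why the maintainers asked for the complete error and the endpoint setup above.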