Where is the problem?
https://docs.konghq.com/hub/kong-inc/ai-proxy/how-to/llm-provider-integration-guides/mistral/
What happened?
There is an inconsistency between the documentation and the provided cURL command for configuring the AI Proxy plugin for Mistral. The documentation states:

"Enable and configure the AI Proxy plugin for Mistral (using ollama format in this example):"

However, the command uses the `openai` format instead of `ollama`. This discrepancy can lead to confusion.
What did you expect to happen?
The documentation should correctly reflect the format used in the command. If the `openai` format is correct, the documentation should state that the example uses the openai format. If `ollama` is correct, the command should be updated accordingly.