Support Azure Open AI #64
Comments
Will look into this
Have you tried using the custom OpenAI compatible option?
Hi, yes, however it doesn't work and the above error is logged. Example endpoint URL:
Ah sorry, you wrote that initially. Is that still the case? I updated the custom OpenAI provider in the last update.
No problem. Yes, it's still the case; I tried it again today.
Our situation in #86 seems to be the same. Have you solved it yet?
I believe the API itself is actually slightly different from the standard OpenAI spec, which is why it doesn't work. But #86 will most likely use the same service as the PAYG production instances (that I'm trying to use).
I forcibly modified the base URL of the core OpenAI Conversation integration to https://models.inference.ai.azure.com and it works there, but LLM Vision cannot use it. 😂
Looking at the Azure OpenAI REST API Reference, it is pretty obvious why it doesn't work: Azure OpenAI is not compatible with OpenAI's API. OpenAI serves chat completions at a fixed endpoint (https://api.openai.com/v1/chat/completions), while Azure routes requests through per-deployment paths ({endpoint}/openai/deployments/{deployment-id}/chat/completions) and requires an api-version query parameter.
For now I can only recommend using OpenAI directly instead, or if you're looking for a free provider, I believe Gemini has a free tier.
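To illustrate the incompatibility described above, here is a minimal sketch of how the two chat-completions URLs differ. The resource name, deployment name, and API version below are hypothetical placeholders, not values from this thread:

```python
def openai_chat_url() -> str:
    """Standard OpenAI chat-completions endpoint: fixed host and path."""
    return "https://api.openai.com/v1/chat/completions"


def azure_chat_url(resource_endpoint: str, deployment: str, api_version: str) -> str:
    """Azure OpenAI routes each request through a per-deployment path and
    requires an api-version query parameter, so a client hard-coded to the
    OpenAI URL scheme cannot reach it."""
    return (
        f"{resource_endpoint}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )


# Hypothetical resource, deployment, and version for illustration only.
url = azure_chat_url("https://myresource.openai.azure.com", "gpt-4o", "2024-02-01")
```

Because the path shape and query string differ, simply pointing the "custom OpenAI compatible" option at an Azure endpoint produces requests Azure rejects.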
If it's worth the effort, I hope it can be made compatible, because Gemini is not available in my region. My GPU has no spare VRAM, so I cannot run LLM Vision locally, and I cannot afford more bills or a higher-VRAM GPU.
@valentinfrlch is there a reason OpenAI's Python package isn't used to facilitate communication with OpenAI-based LLMs?
I figured, since there are some other providers, I might as well just use HTTP requests for all providers, as this is easier to maintain. I am in the process of rewriting request_helpers.py, though. Is there a reason why the Python package should be used?
I see. The package contains support for both standard OpenAI providers (including self-hosted/custom environments) and Azure environments. It shares a common set of functions and exceptions between the two. It is generally considered best practice to use the SDK for a given language if the company/service offers one.
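Even staying with plain HTTP requests, the shared-helper idea suggested above can be sketched as a small dispatcher. The function name and provider keys here are hypothetical, not the integration's actual request_helpers.py API; the two auth header styles (Bearer token for OpenAI, api-key for Azure) follow the respective API documentation:

```python
def build_request(provider: str, api_key: str, base_url: str,
                  deployment: str = "", api_version: str = "") -> dict:
    """Build the URL and auth headers for a chat-completions call.

    OpenAI (and OpenAI-compatible servers) authenticate with a Bearer
    token on a fixed path; Azure OpenAI uses an 'api-key' header and a
    per-deployment path with an api-version query parameter.
    """
    if provider == "azure":
        return {
            "url": (f"{base_url}/openai/deployments/{deployment}"
                    f"/chat/completions?api-version={api_version}"),
            "headers": {"api-key": api_key},
        }
    # Default branch: OpenAI and OpenAI-compatible custom servers.
    return {
        "url": f"{base_url}/v1/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
    }
```

A helper like this would let the rest of the request code stay provider-agnostic, which is roughly what the official SDK does internally with its separate OpenAI and Azure clients.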
Same here. I'm not able to use Azure OpenAI. It would be great to be able to use it.
Is your feature request related to a problem? Please describe.
Currently this integration does not support Azure OpenAI, which works slightly differently from a standard custom OpenAI server.
Integration setup fails with
Could not connect to the server. Check you API key or IP and port
Describe the solution you'd like
Support for Azure OpenAI endpoints.
Describe alternatives you've considered
Using a proxy to rewrite the requests; however, this solution is not as clean as a native one.
Additional context
N/A