Proxy Feature
The proxy feature provides a flexible way to communicate with existing models through third-party hosts or proxies, giving you control over the data flow between your applications and the models.
To direct all OpenAI communication through your Azure tenant, make the following call at the beginning of your application. This ensures that every OpenAI call goes through Azure.
const { ProxyHelper } = require('intellinode');
ProxyHelper.getInstance().setAzureOpenai(resourceName);
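For context, Azure OpenAI resources expose endpoints of the form https://&lt;resourceName&gt;.openai.azure.com, which is what the resource name above resolves to. A minimal sketch of that mapping (the azureBaseUrl helper is hypothetical, not part of IntelliNode's API):

```javascript
// Sketch: how an Azure resource name maps to its OpenAI endpoint.
// `azureBaseUrl` is a hypothetical helper for illustration only.
function azureBaseUrl(resourceName) {
  return `https://${resourceName}.openai.azure.com`;
}

console.log(azureBaseUrl('my-company-resource'));
// https://my-company-resource.openai.azure.com
```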
If you want to direct only specific connections through Azure, pass a ProxyHelper instance to the relevant calls. For instance:
const azureProxyHelper = new ProxyHelper();
azureProxyHelper.setAzureOpenai(resourceName);
const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, azureProxyHelper);
You can use custom proxies to avoid regional restrictions or connect through free services.
- Update the URL in the following JSON:

const openaiProxyJson = {
  "url": "https://api.openai.com",
  "completions": "/v1/completions",
  "chatgpt": "/v1/chat/completions",
  "imagegenerate": "/v1/images/generations",
  "embeddings": "/v1/embeddings"
};
- Create a ProxyHelper object with the custom JSON:

const proxyHelper = new ProxyHelper();
proxyHelper.setOpenaiProxyValues(openaiProxyJson);

- Update the module to use the proxy.
- Example updating a specific service:
const chatbot = new Chatbot(apiKey, SupportedChatModels.OPENAI, proxyHelper);
- Or call the singleton below at the beginning of your application, ensuring all OpenAI connections go through the custom proxy:
ProxyHelper.getInstance().setOpenaiProxyValues(openaiProxyJson);
Note: Please ensure the proxy service you choose is trusted. Using proxies is your responsibility.
To clear the custom proxy at the application level and direct all connections through the official OpenAI service:
ProxyHelper.getInstance().setOriginOpenai();
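The getInstance() calls above imply a shared singleton: every module that uses it sees the same proxy configuration, which is why one call at startup affects the whole application. A minimal sketch of that pattern in plain JavaScript (class and field names here are illustrative assumptions, not IntelliNode's implementation):

```javascript
// Sketch of the app-level singleton pattern implied by ProxyHelper.getInstance().
// `MiniProxyHelper` and its fields are illustrative, not IntelliNode's code.
let sharedInstance = null;

class MiniProxyHelper {
  constructor() {
    this.baseUrl = 'https://api.openai.com'; // official default
  }

  static getInstance() {
    if (!sharedInstance) {
      sharedInstance = new MiniProxyHelper();
    }
    return sharedInstance;
  }

  setOpenaiProxyValues(json) {
    this.baseUrl = json.url;
  }

  setOriginOpenai() {
    this.baseUrl = 'https://api.openai.com';
  }
}

// All callers of getInstance() share one configuration.
MiniProxyHelper.getInstance().setOpenaiProxyValues({ url: 'https://my-proxy.example.com' });
console.log(MiniProxyHelper.getInstance().baseUrl); // https://my-proxy.example.com

MiniProxyHelper.getInstance().setOriginOpenai();
console.log(MiniProxyHelper.getInstance().baseUrl); // https://api.openai.com
```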