
Azure OpenAI env variables missing? #36

Open
flexchar opened this issue May 5, 2023 · 4 comments
Labels
enhancement New feature or request

Comments

@flexchar

flexchar commented May 5, 2023

Hi! This is a fudging awesome piece of software you've written. Thank you.

I've been trying to spin it up locally since I need to test across different providers (Cohere, Azure, etc.).

I cannot seem to find environment variables for Azure. Only OpenAI directly.

Could you please update the documentation to show how? :)

@okisdev
Owner

okisdev commented May 5, 2023

Hi there. Thank you for your interest in this project.

Currently, global environment variables are only supported for OpenAI. If you want to use Azure, you can set it up in the interface by clicking the settings icon in the bottom-left corner and selecting Azure as the service provider.

Once set up, the relevant settings are stored in your browser's local storage and persist until that storage is cleared.
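
For reference, Azure OpenAI needs more settings than a single key, which is why it cannot simply reuse the OpenAI environment variable. A rough sketch with the openai Python SDK (0.27.x-style API, current around the time of this thread; the resource name, deployment name, and key are hypothetical placeholders, not this project's configuration):

import openai

# Azure OpenAI requires an endpoint, an API version, and a deployment name
# in addition to the key -- all values below are placeholders.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "your-azure-api-key"

response = openai.ChatCompletion.create(
    engine="YOUR-DEPLOYMENT-NAME",  # Azure uses deployment names instead of model names
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(response["choices"][0]["message"]["content"])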

However, this is limited to your own use. If you want other people who visit your site to share the Azure AI service, that is unfortunately not possible yet, but it is planned for the future.

We think it would be dangerous for everyone who visits your site to be able to use the AI service directly, so at a later date we will also remove the OpenAI environment variables and manage the relevant keys in a dashboard instead.

@flexchar
Author

flexchar commented May 7, 2023

I understand that is a great approach for individual users. I work at a company where I'm the only developer on an otherwise non-technical team, and we're often evaluating and integrating AI into our workflows.

As such, I have always wanted something like ChatGPT where I could provide my own prompts/templates and use different models, supplying our company's access to Anthropic and Azure OpenAI.

Your work seems to check off 90% of my wish list; however, I don't (and cannot) share API keys with non-technical users around the office. Of course I understand it's not within your scope of work, so feel free to close the ticket. :)

> We think it would be dangerous for everyone who visits your site to be able to use the AI service directly, so at a later date we will also remove the OpenAI environment variables and manage the relevant keys in a dashboard instead.

To clarify, this is for internal and private use only. My hosting would never be exposed outside the office's network.

@okisdev
Owner

okisdev commented May 7, 2023

Thank you for your support.

In fact, any user, whether a developer or not, has the right to use this software, as long as it is permitted by the AGPL-3.0 LICENSE and the terms of the relevant AI service provider.

The purpose of this software is to make better use of AI services, no matter which provider they come from.

Thanks again. I will consider this case in depth and address it in the best possible way.

@okisdev okisdev added the enhancement New feature or request label May 7, 2023
@ishaan-jaff

Hi @flexchar, I believe we can help with this issue. I'm the maintainer of LiteLLM: https://github.com/BerriAI/litellm

TL;DR:
We allow you to use any LLM as a drop-in replacement for gpt-3.5-turbo.
If you don't have direct access to an LLM, you can use the LiteLLM proxy to make requests to it.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider API directly

from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["COHERE_API_KEY"] = "your-cohere-key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
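
Since this issue is about Azure specifically: LiteLLM also supports Azure OpenAI through its own environment variables. A rough sketch based on LiteLLM's documented Azure usage (the endpoint and deployment name below are placeholders):

from litellm import completion
import os

## set Azure ENV variables (placeholder values)
os.environ["AZURE_API_KEY"] = "your-azure-key"
os.environ["AZURE_API_BASE"] = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2023-05-15"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# azure call -- prefix the model with "azure/" followed by your deployment name
response = completion(model="azure/YOUR-DEPLOYMENT-NAME", messages=messages)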

Using the LiteLLM Proxy with a LiteLLM Key

This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it.

from litellm import completion
import os

## set ENV variables 
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
