API key passthrough missing for custom OpenAI-compatible models #14288
Comments
The intention was to avoid accidental leakage of OpenAI keys to non-OpenAI parties. Suggestion:
@dannaf Does this work for you? Workaround for adopters:
The first point sounds great; I think it's exactly what should be done: for each custom model, the user could set an (optional) apiKey environment/settings variable. (It would be optional for situations where it is not needed, like a model running locally, or whatever you had in mind in the initial implementation.) Avoiding accidental leakage of the OpenAI API key to non-OpenAI models is also correct. I don't think it's an issue at all to require the user of a custom model that needs an API key to explicitly indicate it via an environment/settings variable.

There is no need to allow the official OpenAI key to be used for custom models when there is an intentionally implemented custom API key specification method via an env/settings variable; I had only suggested that as a temporary workaround (which also was not available) just to further motivate implementing a solution. I don't think it needs to be a feature of an intentional implementation, but if you think it could be a convenience (I am not sure it would be, as the API key for the custom model is presumably different), then maybe there should be something like an explicit opt-in for it.

What exactly do you mean is the default OpenAI behavior that you mentioned? Can you link docs for this?
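For illustration, a per-model key could live directly in the existing customOpenAiModels setting. This is only a hypothetical sketch: the "apiKey" field and the Perplexity values are assumptions for discussion, not an implemented API.

```json
{
  "ai-features.openAiCustom.customOpenAiModels": [
    {
      "model": "sonar-pro",
      "url": "https://api.perplexity.ai",
      "apiKey": "PERPLEXITY_API_KEY"
    }
  ]
}
```

A design question left open here is whether "apiKey" should hold the key itself or the name of an environment variable to read it from; the latter keeps secrets out of settings files.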
I like the idea of an additional per-model apiKey setting.
The configuration for custom OpenAI models now allows specifying a unique 'apiKey' for each model, or reusing the global OpenAI API key. fixes eclipse-theia#14288
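A minimal sketch of the key-resolution logic that change describes, assuming a hypothetical CustomOpenAiModel shape (the interface and field semantics below are illustrative, not the actual Theia types):

```typescript
// Hypothetical shape of one entry in the custom-models setting.
interface CustomOpenAiModel {
    model: string;
    url: string;
    // Per-model key; `true` means "reuse the global OpenAI API key".
    apiKey?: string | true;
}

// Resolve the key to send with a request to a custom model:
// a per-model key wins, the global key is reused only on explicit opt-in,
// and otherwise the 'no-key' placeholder is kept so the official key
// never leaks to third-party endpoints by accident.
function resolveCustomModelKey(model: CustomOpenAiModel, globalOpenAiKey?: string): string {
    if (typeof model.apiKey === 'string') {
        return model.apiKey;
    }
    if (model.apiKey === true && globalOpenAiKey) {
        return globalOpenAiKey;
    }
    return 'no-key';
}
```

Making reuse of the global key opt-in (the `true` case) preserves the original leakage protection while still offering the convenience discussed above.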
Bug Description:
Theia 1.54.0 claims to support custom OpenAI-compatible models, including models hosted "in the cloud", but in practice it only supports models that do not require an API key: it does not pass through a custom-model API key from the AI features settings or an environment variable. As a result, there is no straightforward, non-hacky way to connect a cloud-hosted, API-key-protected OpenAI-compatible model (e.g. the Perplexity Pro API).
There is not even a straightforwardly hacky way of doing it without customizing the Theia code (e.g. by reusing the official OpenAI API environment variable for non-official, OpenAI-compatible APIs), because the current implementation outright replaces the API key with a hardcoded 'no-key' string when calling a custom model; here:
theia/packages/ai-openai/src/node/openai-language-model.ts
Line 188 in c7fb4f5
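For context, the behavior reported at that line can be approximated as follows. This is a paraphrase of the described logic for discussion, not the actual Theia source:

```typescript
// Approximation of the reported behavior: custom (non-official) models
// always receive a hardcoded placeholder key, so any configured OpenAI
// key is never forwarded to them -- even when the user wants it to be.
function effectiveApiKey(isOfficialOpenAi: boolean, openAiKey?: string): string {
    if (isOfficialOpenAi && openAiKey) {
        return openAiKey;
    }
    return 'no-key';
}
```

The placeholder is what a custom endpoint actually receives as its bearer token, which is why key-protected endpoints reject the request.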
Steps to Reproduce:
Configure a custom model via the "ai-features.openAiCustom.customOpenAiModels" setting, but...

Additional Information