Create an ILMProvider interface and have our current implementation use it #17394
Conversation
Okay I'm not gonna block over any of these, since this is clearly an in-progress commit on the way to supporting other providers. But I do feel like we may have one too many layers of abstraction. (maybe it's because I haven't seen the other provider implementations yet)
```
// If the AI key and endpoint is still empty, tell the user to fill them out in settings
if (_AIKey.empty() || _AIEndpoint.empty())
if (_llmProvider)
```
Yea this does feel bodgy currently, but there's only one provider for now so it seems totally fine. When there are other providers, it seems like it'd make more sense to have separate errors for "you haven't set up any LLM backend" (generic, from the Extension palette itself) vs "you didn't set up an API key" (from the individual providers)
Approving since it's pretty much just moving stuff. The bugs I found also exist in the old version. 😅
```
// Send the request
try
{
    const auto response = _httpClient.SendRequestAsync(request).get();
```
FWIW all these `get()`s could use `co_await` in the future instead. That avoids the `resume_background` hassle.
```
@@ -57,6 +59,7 @@ namespace Microsoft.Terminal.Settings.Model
    String AIEndpoint;
    String AIKey;
    static event AzureOpenAISettingChangedHandler AzureOpenAISettingChanged;
```
This is going to be moved into the new `AIConfig` struct in the follow-up PR that introduces OpenAI.
A static event? that's weird to me but okay. It's moving soon so I'll look at it there
> A static event

Yep, that's because these strings aren't actually saved in our settings; they're saved in the credential manager. Which means that if there are multiple instances of Terminal open (each with its own Terminal Chat), and the auth is changed in one of them, this is what we have to emit to get all instances to update.
```
// auth related functions
void SetAuthentication(Windows.Foundation.Collections.ValueSet authValues);
event Windows.Foundation.TypedEventHandler<ILMProvider, String> AuthChanged;
```
`AzureOpenAILLMProvider` doesn't actually emit this event, but the GitHub one is going to, so I'm putting it in already.
Sorry it took so long to loop back on this. Let's start merging down!
```
@@ -156,6 +156,9 @@ namespace winrt::Microsoft::Terminal::Settings::Model::implementation
    void ExpandCommands();

    static winrt::event_token AzureOpenAISettingChanged(const AzureOpenAISettingChangedHandler& handler);
```
Can this be a static `til::event`? I dunno if I've ever done a static event so I genuinely don't know.
Check out `ConptyConnection::NewConnection`!
```
{
    const auto model = jsonResponse.GetNamedString(L"model");
    bool modelIsAccepted{ false };
    for (const auto acceptedModel : acceptedModels)
```
I'm sure if we didn't just use a `string[]` we could do some sort of smarter `.contains()`, but also, eh. We accept like 5 models; this is fine.
🦵🏻
🍗
🦿
Summary of the Pull Request
Validation Steps Performed
Everything still works
PR Checklist