feat (provider/togetherai): Add TogetherAI provider. #3781
base: main
Conversation
packages/openai-compatible/src/map-openai-compatible-completion-logprobs.ts (outdated; resolved)
// TODO(shaper): This is really model-specific, move to config or elsewhere?
defaultObjectGenerationMode?: 'json' | 'tool' | undefined;
can you make it a constructor parameter (in the options) for the model that is then defined in the providers? (which would have that knowledge)
By "options" do you mean config? E.g. the OpenAICompatibleChatLanguageModel constructor signature is (modelId, settings, config).
I'll move it there and see how to expose it to the concrete provider impl, which today just passes a settings object (where I had it when you commented), but we could add an options or model-config arg as well.
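To illustrate the direction discussed above, here is a minimal sketch of threading defaultObjectGenerationMode through the model's config object so each concrete provider (which knows its models) supplies it. The type shapes and the provider string below are illustrative assumptions, not the actual package types.

```typescript
// Sketch: the model reads its default object-generation mode from config,
// which the concrete provider constructs with model-specific knowledge.
// All shapes here are assumptions mirroring the PR discussion.

type ObjectGenerationMode = 'json' | 'tool' | undefined;

interface ChatConfig {
  provider: string;
  defaultObjectGenerationMode?: ObjectGenerationMode;
}

class OpenAICompatibleChatLanguageModel {
  readonly modelId: string;
  private readonly config: ChatConfig;

  constructor(modelId: string, settings: object, config: ChatConfig) {
    this.modelId = modelId;
    this.config = config;
  }

  // The model consults its config rather than hard-coding a value.
  get defaultObjectGenerationMode(): ObjectGenerationMode {
    return this.config.defaultObjectGenerationMode;
  }
}

// A provider that knows its models support JSON mode declares it here:
const model = new OpenAICompatibleChatLanguageModel(
  'example-model-id',
  {},
  { provider: 'togetherai.chat', defaultObjectGenerationMode: 'json' },
);
```

This keeps the settings object user-facing while the provider-level knowledge lives in config, matching the constructor signature quoted later in the thread.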
// TODO(shaper): Review vs. OpenAI impl here.
throw new UnsupportedFunctionalityError({
  functionality: 'object-json mode',
});
copy what openai has including schemas
Done. I omitted structuredOutputs since, AFAIK, it's only supported by OpenAI today.
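As a rough sketch of what "copying what OpenAI has" minus structuredOutputs could look like: plain JSON mode maps onto response_format: json_object, which most OpenAI-compatible APIs accept, while the json_schema variant is omitted. The mode names and helper below are assumptions for illustration, not the package's actual internals.

```typescript
// Sketch (assumed shapes): mapping generation modes onto an OpenAI-style
// request body. The structuredOutputs / json_schema variant is deliberately
// left out, since only OpenAI supports it today.

type GenerationMode = 'regular' | 'object-json' | 'object-tool';

function withMode(
  mode: GenerationMode,
  base: Record<string, unknown>,
): Record<string, unknown> {
  switch (mode) {
    case 'object-json':
      // Widely supported plain JSON mode.
      return { ...base, response_format: { type: 'json_object' } };
    case 'object-tool':
      // Force a tool call to produce the object.
      return { ...base, tool_choice: 'required' };
    default:
      return base;
  }
}
```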
constructor(
  modelId: OpenAICompatibleChatModelId,
  settings: OpenAICompatibleChatSettings,
  config: OpenAICompatibleChatConfig,
include the default object generation mode in config
Done. I'd like to define a type for the values 'json' | 'tool' | undefined. It appears the first reference is in LanguageModelV1. Is it appropriate to add something there?
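One plausible shape for that shared type, sketched below. The alias name is hypothetical; the idea is simply that the spec package owns the union and the compatible config references it instead of repeating the literal types.

```typescript
// Sketch: a single shared alias for the mode union, so the spec package
// and the openai-compatible config reference one name. The alias name
// below is a hypothetical suggestion, not the shipped identifier.

export type LanguageModelV1ObjectGenerationMode = 'json' | 'tool' | undefined;

// Consumers then use the alias instead of repeating the union:
export interface OpenAICompatibleChatConfig {
  defaultObjectGenerationMode?: LanguageModelV1ObjectGenerationMode;
}
```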
Resolved (outdated) review threads on:
packages/openai-compatible/src/openai-compatible-completion-language-model.ts (3 threads)
packages/openai-compatible/src/openai-compatible-embedding-model.test.ts (2 threads)
packages/openai-compatible/src/openai-compatible-embedding-settings.ts
// TODO(shaper): Reconcile this with openai-error.ts. We derived from `xai`.

export const openAICompatibleErrorDataSchema = z.object({
  code: z.string(),
  error: z.string(),
});

export type OpenAICompatibleErrorData = z.infer<
  typeof openAICompatibleErrorDataSchema
>;

export const openAICompatibleFailedResponseHandler =
  createJsonErrorResponseHandler({
    errorSchema: openAICompatibleErrorDataSchema,
    errorToMessage: data => data.error,
  });
Make this provider-specific: different providers have different error structures. Provide a default that matches OpenAI.
I can see a couple of different ways to do this. The error data schema is further embedded in the schemas used in the chat/completion model classes, so just making the failed response handler configurable isn't sufficient for the goal.
I have in mind to break this PR into two so that we can start landing parts and work more incrementally:
- initial openai-compatible package work
- togetherai package atop it
I expect we can land (1) tomorrow after I iterate on your next feedback. We can then put together a focused change to make the error schema/response handler configurable. And we can iterate on (2) until it supports the model/feature combos we feel are good enough for a first ship target.
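For the follow-up change discussed here, one possible shape is to let each provider pass an "error structure" (schema plus message extractor) into config, with a default matching OpenAI's nested error body. The sketch below uses a plain parse function in place of the package's zod schemas to stay dependency-free; all names are assumptions.

```typescript
// Sketch (assumed names): provider-configurable error handling with a
// default that matches OpenAI's { error: { message } } body. A real
// implementation would use zod schemas as the package does elsewhere.

interface ErrorStructure<T> {
  parseErrorData: (value: unknown) => T;
  errorToMessage: (data: T) => string;
}

// Default matching OpenAI's nested error body.
const defaultOpenAIErrorStructure: ErrorStructure<{ error: { message: string } }> = {
  parseErrorData: value => value as { error: { message: string } },
  errorToMessage: data => data.error.message,
};

// A provider with a flat { code, error } body overrides the default:
const flatErrorStructure: ErrorStructure<{ code: string; error: string }> = {
  parseErrorData: value => value as { code: string; error: string },
  errorToMessage: data => data.error,
};

// The failed-response handler is then built per provider from its structure.
function createFailedResponseHandler<T>(structure: ErrorStructure<T>) {
  return (body: unknown): string =>
    structure.errorToMessage(structure.parseErrorData(body));
}
```

Because the chat/completion response schemas also embed the error shape, the real change would thread the same structure into those schemas rather than only swapping the handler, as noted above.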
Built on top of a new openai-compatible package for better code sharing as we add more top-level providers that follow this pattern.
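To make the layering concrete, here is a hypothetical sketch of how a thin provider package could sit atop the shared openai-compatible classes by supplying only a base URL, auth headers, and model knowledge. The factory name, base URL, and returned shape are illustrative assumptions, not the shipped API.

```typescript
// Sketch (hypothetical wiring): a provider package contributes only its
// endpoint, headers, and provider name; the shared openai-compatible
// package supplies the model behavior.

interface ProviderConfig {
  provider: string;
  baseURL: string;
  headers: () => Record<string, string>;
}

function createTogetherAI(options: { apiKey: string }) {
  const config: ProviderConfig = {
    provider: 'togetherai.chat',
    // Assumed endpoint for illustration.
    baseURL: 'https://api.together.xyz/v1',
    headers: () => ({ Authorization: `Bearer ${options.apiKey}` }),
  };
  return {
    // In the real package this would construct an
    // OpenAICompatibleChatLanguageModel; a plain object stands in here.
    chatModel: (modelId: string) => ({ modelId, config }),
  };
}

const provider = createTogetherAI({ apiKey: 'sk-test' });
const model = provider.chatModel('example/model-id');
```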