
feat: improve abstraction for external providers #123

Merged: 29 commits into main, Dec 15, 2024

Conversation

grabbou (Collaborator) commented Dec 14, 2024

Slightly better abstraction (but still lightweight) for other providers.
Support for Ollama (though results with local LLMs are rather poor; I would love it if someone could experiment with it).
We may need to tweak prompts on a per-model basis, which I would see as a task for the future.

Fixes #114
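
For illustration, here is a minimal sketch of what such a lightweight provider abstraction could look like in TypeScript. All names here (`Provider`, `ChatRequest`, `ChatResponse`, the `ollama` factory) are hypothetical and not the framework's actual API; the Ollama call targets the standard local `/api/chat` endpoint.

```typescript
// Sketch only: hypothetical names, not the framework's real API.

type Message = {
  role: 'system' | 'user' | 'assistant'
  content: string
}

type ChatRequest = {
  messages: Message[]
}

type ChatResponse = {
  content: string
}

// Each provider implements the same narrow surface, so agents stay provider-agnostic.
interface Provider {
  chat(request: ChatRequest): Promise<ChatResponse>
}

// Hypothetical factory for an Ollama-backed provider, calling the local /api/chat endpoint.
const ollama = (options: { model: string; baseUrl?: string }): Provider => ({
  async chat(request) {
    const response = await fetch(`${options.baseUrl ?? 'http://localhost:11434'}/api/chat`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model: options.model,
        messages: request.messages,
        stream: false,
      }),
    })
    const body = await response.json()
    // With streaming off, Ollama returns the assistant reply under `message.content`.
    return { content: body.message.content }
  },
})
```

Keeping the surface to a single `chat` method means adding another backend (Grok, or any OpenAI-compatible API) only requires another small factory.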

@grabbou grabbou marked this pull request as ready for review December 14, 2024 11:08
Two review threads on packages/framework/src/providers/ollama.ts (outdated, resolved)
pkarw (Collaborator) commented Dec 14, 2024

Great job. Approved, with just one question regarding the token usage stats within the LLM response: if not now, we will probably add them later anyway, since they are fairly standard in LLM responses across frameworks.
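
For context, a sketch of how token usage could be surfaced on the provider response. The camelCase field names and the `toUsage` helper are assumptions for illustration; the snake_case input mirrors the `usage` object that OpenAI-style APIs return.

```typescript
// Hypothetical shape for usage stats; the exact fields in this framework are still open.
type Usage = {
  promptTokens: number
  completionTokens: number
  totalTokens: number
}

type ChatResponseWithUsage = {
  content: string
  usage?: Usage // optional, since not every provider reports usage
}

// Maps the snake_case `usage` object returned by OpenAI-compatible APIs.
const toUsage = (raw: {
  prompt_tokens: number
  completion_tokens: number
  total_tokens: number
}): Usage => ({
  promptTokens: raw.prompt_tokens,
  completionTokens: raw.completion_tokens,
  totalTokens: raw.total_tokens,
})
```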

grabbou (Collaborator, Author) commented Dec 14, 2024

Right now, Ollama won't work due to ollama/ollama#8095, unless I misconfigured the request. I will try to replicate the behavior by tweaking the schema, and we will see.
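
As a reference for that experiment, here is a hedged sketch of sending a JSON schema to Ollama's `/api/chat` endpoint via the `format` field (used for structured outputs in recent Ollama versions). The model name and schema are placeholders; whether simplifying the schema avoids the issue above is exactly what still needs to be verified.

```typescript
// Sketch only: placeholder model and schema, standard Ollama /api/chat request.
const schema = {
  type: 'object',
  properties: {
    answer: { type: 'string' },
  },
  required: ['answer'],
}

async function testStructuredOutput() {
  const response = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3.1',
      messages: [
        { role: 'user', content: 'Reply with a JSON object that has an "answer" field.' },
      ],
      format: schema, // JSON schema constraining the output (structured outputs)
      stream: false,
    }),
  })
  const body = await response.json()
  // If the constraint is honored, this parses cleanly; if not, it reproduces the issue.
  console.log(JSON.parse(body.message.content))
}

testStructuredOutput().catch(console.error)
```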

pkarw (Collaborator) left a review


Approved with two minor suggestions

Review thread on packages/framework/src/providers/openai_functions.ts (outdated, resolved)
Review thread on packages/framework/src/providers/grok.ts (outdated, resolved)
@grabbou grabbou marked this pull request as draft December 15, 2024 18:54
@grabbou grabbou marked this pull request as ready for review December 15, 2024 19:51
grabbou (Collaborator, Author) commented Dec 15, 2024

I am going to merge it, but let's test it thoroughly tomorrow before cutting a release, to make sure it works as expected!

grabbou (Collaborator, Author) commented Dec 15, 2024

Grok is working well!

grabbou merged commit 30d5297 into main on Dec 15, 2024
Development

Successfully merging this pull request may close the following issue: feat: support for local models (ollama)

2 participants