Issues: withcatai/node-llama-cpp
Issues list
#384 · Supporting minItems and others in LlamaJsonSchemaGrammar (see the sketch after this list)
Labels: new feature (New feature or request), requires triage (Requires triaging)
Opened Nov 15, 2024 by TrevorSundberg

#381 · NoBinaryFoundError for Windows in Electron when Upgrading from 3.0.0-beta44 -> 3.2.0
Labels: bug (Something isn't working), requires triage (Requires triaging)
Opened Nov 5, 2024 by bitterspeed

#277 · When I use node-llama-cpp to run inference, Cloud Run fails with a 503 error
Labels: bug (Something isn't working), requires triage (Requires triaging)
Opened Jul 30, 2024 by MarioSimou

#180 · Support file-based prompt caching
Labels: new feature (New feature or request), roadmap (part of the roadmap for node-llama-cpp: https://github.com/orgs/withcatai/projects/1)
Opened Mar 16, 2024 by StrangeBytesDev

#88 · feat: pass an image as part of the evaluation
Labels: new feature (New feature or request), roadmap (part of the roadmap for node-llama-cpp: https://github.com/orgs/withcatai/projects/1)
Opened Nov 5, 2023 by giladgd

#86 · feat: use multiple machines for evaluating large models that require a lot of RAM
Labels: new feature (New feature or request), roadmap (part of the roadmap for node-llama-cpp: https://github.com/orgs/withcatai/projects/1)
Opened Nov 5, 2023 by giladgd
