
Catch llama.cpp errors in Node #93

Closed · Answered by giladgd
fmaclen asked this question in Q&A

There's currently an issue with prompts that are longer than the batchSize; it will be fixed as part of #85.
For a workaround in the meantime, see #76
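Until the batchSize issue is fixed, errors raised from the native llama.cpp layer can be caught in Node like any other rejected promise. A minimal sketch of that pattern — note that `session.prompt` and the character-count length guard here are illustrative assumptions, not node-llama-cpp's actual API (real limits are in tokens, which require the model's tokenizer):

```javascript
// Hypothetical helper: wraps a prompt call so native-layer failures
// surface as ordinary JS errors, with a rough pre-check against batchSize.
async function promptSafely(session, prompt, batchSize) {
  // Character count is only a crude proxy for token count.
  if (prompt.length > batchSize) {
    throw new Error(`Prompt may exceed batchSize (${batchSize})`);
  }
  try {
    return await session.prompt(prompt);
  } catch (err) {
    // Re-throw with context so callers can distinguish evaluation failures.
    throw new Error(`llama.cpp evaluation failed: ${err.message}`);
  }
}

// Usage with a stub session standing in for the real library object:
const stubSession = { prompt: async (p) => `echo: ${p}` };
promptSafely(stubSession, "Hello", 512).then(console.log);
```

The same try/catch applies to whatever method the library exposes for evaluation; the key point is that the call returns a promise, so native errors reject rather than crash the process.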

Replies: 1 comment

Answer selected by giladgd
Category: Q&A
Labels: none yet
2 participants