
n_tokens <= n_batch Error #102

Closed · Answered by OlleMattsson
hafsalm asked this question in Q&A

The LlamaContextOptions type allows you to set the batch size.

https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaContextOptions#type-alias-llamacontextoptions

e.g.

const context = new LlamaContext({
    batchSize: 1024,
});

Hope that helps. I'm unable to give you guidance on exactly how to implement that in your particular code.
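
For context, here is a minimal sketch of where that option fits into a full setup, assuming the v2-style node-llama-cpp API from the linked docs; the model path, file name, and the 1024 value are placeholders, not values from this thread:

import path from "path";
import {fileURLToPath} from "url";
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load a model first; the context is created from a model instance.
const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "model.gguf") // placeholder path
});

// A larger batchSize lets more tokens be evaluated per batch,
// which is the limit the "n_tokens <= n_batch" assertion checks.
const context = new LlamaContext({
    model,
    batchSize: 1024
});

const session = new LlamaChatSession({context});
console.log(await session.prompt("Hi there"));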

Answer selected by hafsalm