n_tokens <= n_batch Error #102
-
Hi, I'm running into this error with node-llama-cpp. From what I understand, it typically occurs when the number of tokens exceeds the batch size, but I'm not sure how to fix this in my code. Could anyone provide some guidance on what might be causing it and how to resolve it? Here's my code (it's a LangChain.js RunnableSequence; it works fine with Ollama, but not with node-llama-cpp):
Replies: 1 comment 1 reply
-
The LlamaContextOptions allow you to set the batch size: https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaContextOptions#type-alias-llamacontextoptions
For example:
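Something along these lines (a minimal sketch assuming the node-llama-cpp v2.x constructor API; the model path and the sizes are placeholders, not values from this thread):

```typescript
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

// Load the model (modelPath is a placeholder).
const model = new LlamaModel({
    modelPath: "models/your-model.gguf"
});

// batchSize controls how many tokens can be evaluated in a single batch.
// If a prompt is longer than batchSize, llama.cpp aborts with
// "n_tokens <= n_batch", so raise it (together with contextSize)
// until it fits your longest prompts.
const context = new LlamaContext({
    model,
    contextSize: 4096,
    batchSize: 4096
});

const session = new LlamaChatSession({context});
console.log(await session.prompt("Hi there!"));
```

The key point is that the batchSize you pass must be at least as large as the longest prompt you feed in.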
Hope that helps. I'm unable to give you guidance on how exactly to implement that in your particular code.