v3.0.0-beta.17
Pre-release
3.0.0-beta.17 (2024-04-24)
Bug Fixes
- `FunctionaryChatWrapper` bugs (#205) (ef501f9)
- function calling syntax bugs (#205) (ef501f9)
- show GPU layers in the `Model` line in CLI commands (#205) (ef501f9)
- refactor: rename `LlamaChatWrapper` to `Llama2ChatWrapper` (#205) (ef501f9)
Features
- Llama 3 support (#205) (ef501f9)
- `--gpu` flag in generation CLI commands (#205) (ef501f9)
- `specialTokens` parameter on `model.detokenize` (#205) (ef501f9)
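The new `specialTokens` parameter on `model.detokenize` can be used roughly as follows. This is a minimal sketch, assuming the v3 beta API shape (`getLlama`, `llama.loadModel`); the model path is a placeholder, not something from this release.

```typescript
import {getLlama} from "node-llama-cpp";

const llama = await getLlama();

// The model path below is a placeholder for illustration only
const model = await llama.loadModel({
    modelPath: "path/to/model.gguf"
});

// Tokenize some text, including special tokens in the output
const tokens = model.tokenize("Hello world", true);

// Passing `true` for the `specialTokens` parameter renders special
// tokens in their textual form instead of dropping them
const text = model.detokenize(tokens, true);
console.log(text);
```

Whether special tokens appear in the detokenized text depends on the model's tokenizer, so treat the exact output as model-specific.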
Shipped with llama.cpp release `b2717`

To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)