Releases: withcatai/node-llama-cpp
v3.0.0-beta.5 (2024-01-24)
Bug Fixes
Features
Shipped with llama.cpp release b1961.
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.
v3.0.0-beta.4 (2024-01-21)
Features
Shipped with llama.cpp release b1892.
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.
v3.0.0-beta.3 (2024-01-21)
Features
Shipped with llama.cpp release b1892.
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.
v2.8.5
v3.0.0-beta.2 (2024-01-20)
Bug Fixes
- adapt to breaking changes of llama.cpp (#117) (595a6bc)
- threads parameter (#139) (5fcdf9b)
- disable Metal for x64 arch by default (#139) (5fcdf9b)
Features
- function calling (#139) (5fcdf9b)
- chat syntax aware context shifting (#139) (5fcdf9b)
- stateless LlamaChat (#139) (5fcdf9b)
- improve chat wrapper (#139) (5fcdf9b)
- LlamaText util (#139) (5fcdf9b)
- show llama.cpp release in GitHub releases (#142) (36c779d)
Shipped with llama.cpp release b1892.
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.
v2.8.4
v2.8.3
v2.8.2
v2.8.1
v3.0.0-beta.1 (2023-11-26)
BREAKING CHANGES
- completely new API (docs will be updated before a stable version is released)