Releases: withcatai/node-llama-cpp
v3.0.0-beta.12
3.0.0-beta.12 (2024-02-24)
Bug Fixes
Features
Shipped with llama.cpp release b2254
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v2.8.8
v3.0.0-beta.11
3.0.0-beta.11 (2024-02-18)
Features
- completion and infill (#164) (ede69c1)
- support configuring more options for `getLlama` when using `"lastBuild"` (#164) (ede69c1) (see the sketch below)
- export `resolveChatWrapperBasedOnWrapperTypeName` (#165) (624fa30)
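A minimal sketch of loading the bindings from the last local build while passing extra options, assuming the `getLlama("lastBuild", ...)` form and the `logLevel` option of the current v3 API; exact option names may differ in this beta:

```typescript
import {getLlama, LlamaLogLevel} from "node-llama-cpp";

// Use the binaries produced by the most recent local build instead of the
// prebuilt ones, and pass additional options alongside "lastBuild".
// The logLevel option name here is an assumption based on the notes above.
const llama = await getLlama("lastBuild", {
    logLevel: LlamaLogLevel.warn
});

const model = await llama.loadModel({modelPath: "path/to/model.gguf"});
console.log("Model loaded using the last local build");
```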
Shipped with llama.cpp release b2174
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v2.8.7
v3.0.0-beta.10
3.0.0-beta.10 (2024-02-11)
Features
- get VRAM state (#161) (46235a2)
- `chatWrapper` getter on a `LlamaChatSession` (#161) (46235a2)
- minP support (#162) (47b476f) (see the sketch below)
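A short sketch putting these three features together, using the names from the current v3 API (`getVramState()`, the `chatWrapper` getter, and the `minP` prompt option); the model/context/session construction shown is an assumption and may differ in this beta:

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();

// New: query the VRAM state before loading a model
const vram = await llama.getVramState();
console.log(`VRAM: ${vram.used} used of ${vram.total} bytes`);

const model = await llama.loadModel({modelPath: "path/to/model.gguf"});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// New: inspect the chat wrapper the session resolved to
console.log(session.chatWrapper);

// New: minP sampling support on prompt calls
const answer = await session.prompt("Hello!", {minP: 0.05});
console.log(answer);
```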
Shipped with llama.cpp release b2127
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.9
3.0.0-beta.9 (2024-02-05)
Bug Fixes
- don't block a node process from exiting (#157) (74fb35c)
- respect `logLevel` and `logger` params when using `"lastBuild"` (#157) (74fb35c) (see the sketch below)
- print logs on the same event loop cycle (#157) (74fb35c)
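A minimal sketch of the fixed behavior, assuming `getLlama("lastBuild", ...)` accepts `logLevel` and a `logger` callback as in the current v3 API; the option names are assumptions for this beta:

```typescript
import {getLlama, LlamaLogLevel} from "node-llama-cpp";

// After this fix, both options below are honored when loading the last local build.
const llama = await getLlama("lastBuild", {
    logLevel: LlamaLogLevel.warn,
    logger(level, message) {
        console.log(`[llama.cpp] ${level}: ${message}`);
    }
});

console.log("Bindings loaded with custom logging");
```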
Shipped with llama.cpp release b2074
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.8
3.0.0-beta.8 (2024-02-05)
Bug Fixes
Shipped with llama.cpp release b2060
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.7
3.0.0-beta.7 (2024-02-05)
Bug Fixes
Shipped with llama.cpp release b2060
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v2.8.6
v3.0.0-beta.6
3.0.0-beta.6 (2024-02-04)
Bug Fixes
Features
- manual binding loading (#153) (0e4b8d2) (see the sketch below)
- log settings (#153) (0e4b8d2)
- ship CUDA prebuilt binaries (#153) (0e4b8d2)
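A minimal sketch of the new manual binding loading, assuming the `getLlama()` entry point added in this release and a `logLevel` option for the new log settings; the option names and the automatic pickup of the CUDA prebuilt binaries are assumptions based on these notes:

```typescript
import {getLlama, LlamaLogLevel} from "node-llama-cpp";

// Manual binding loading: explicitly resolve and load the native llama.cpp
// bindings (prebuilt binaries when available, including the new CUDA builds,
// otherwise a local build) instead of relying on implicit loading.
// The logLevel option name is an assumption based on the "log settings" feature.
const llama = await getLlama({
    logLevel: LlamaLogLevel.warn
});

console.log("llama.cpp bindings loaded");
```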
Shipped with llama.cpp release b2060
To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`. (learn more)