Releases: withcatai/node-llama-cpp
v2.6.0
2.6.0 (2023-10-09)
Features
- adapt to llama.cpp changes (#60) (3400fce)
- add repeat penalty support (#60) (3400fce)
- improve grammar support (#60) (3400fce)
- better API for customizing context and chat session options, while maintaining compatibility with the existing API (#60) (3400fce)
- git release bundle (#61) (ada896b)
- new documentation website (#62) (c0deffd)
- improve chat command (#62) (c0deffd)
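The repeat penalty added in this release follows llama.cpp's sampling convention: tokens that already appeared in the recent context have their logits penalized so the sampler is less likely to pick them again. A minimal illustrative sketch of that convention (not node-llama-cpp's actual implementation; the function name and shape here are hypothetical):

```typescript
// Sketch of a llama.cpp-style repeat penalty applied to raw logits.
// `recentTokens` holds token ids seen in the recent window; `penalty` > 1
// weakens repeated tokens, `penalty` = 1 disables the effect.
function applyRepeatPenalty(
    logits: number[],
    recentTokens: number[],
    penalty: number
): number[] {
    const out = logits.slice();
    for (const tokenId of new Set(recentTokens)) {
        // llama.cpp convention: divide positive logits, multiply negative
        // ones, so the penalty always pushes the logit toward "less likely".
        out[tokenId] = out[tokenId] > 0
            ? out[tokenId] / penalty
            : out[tokenId] * penalty;
    }
    return out;
}
```

For example, with a penalty of 2, a repeated token's logit of 2 drops to 1, and a repeated token's logit of -1 drops to -2, while unseen tokens are untouched.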
v2.5.1
2.5.1 (2023-09-27)
Bug Fixes
- make disabling Metal at build time work (#55) (03ec18b)
v2.5.0
2.5.0 (2023-09-26)
Bug Fixes
- adapt to llama.cpp interface change (#49) (9db72b0)
Features
- add FalconChatPromptWrapper (#53) (656bf3c)
- fall back to build from source if prebuilt binary loading fails (#54) (d99e3b0)
- load conversation history into a LlamaChatSession (#51) (4e274ce)
- only build one binary for all node versions (#50) (1e617cd)
v2.4.0
2.4.0 (2023-09-09)
Features
v2.3.2
2.3.2 (2023-09-02)
Bug Fixes
- load image URLs properly outside GitHub as well (#35) (cf1f5f1)
v2.3.0
2.3.0 (2023-09-02)
Bug Fixes
- handle stop words remainder properly in a chat session (#32) (9bdef11)
- move default export to be the last one in package.json (#31) (dd49959)
Features
v2.2.0
2.2.0 (2023-09-01)
Features
- export class options types (#29) (74be398)
- improve error message when llama.cpp source is not downloaded (#27) (7837af7)
- make contributions and support more efficient via GitHub templates (#28) (5fc0d18)
v2.1.2
2.1.2 (2023-08-28)
Bug Fixes