docs(README): improve contributing docs (#42)
* docs: improve contributing docs
* fix: make dev env work on Windows
* docs: add mdn links to generated documentation
Showing 5 changed files with 385 additions and 18 deletions.
# How to contribute to `node-llama-cpp`
This document describes how to set up your development environment to contribute to `node-llama-cpp`.

## Prerequisites
- [Git](https://git-scm.com/). [GitHub's Guide to Installing Git](https://help.github.com/articles/set-up-git) is a good source of information.
- [Node.js](https://nodejs.org/en/) (v18 or higher)
- [cmake dependencies](https://github.com/cmake-js/cmake-js#installation:~:text=projectRoot/build%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%5Bstring%5D-,Requirements%3A,-CMake) - make sure the required dependencies of `cmake` are installed on your machine; you don't necessarily have to install `cmake` itself, just its dependencies.
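
Before moving on, it can help to confirm the basics are in place. Here is a quick, optional sanity check (the exact versions printed will vary on your machine):
```bash
# Optional sanity check for the prerequisites above
git --version     # any recent Git release is fine
node --version    # should report v18.0.0 or higher
```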

## Setup
1. [Fork `node-llama-cpp` repo](https://github.com/withcatai/node-llama-cpp/fork)
2. Clone your forked repo to your local machine
3. Install dependencies:
   ```bash
   npm install
   ```
4. Build the CLI, use the CLI to clone the latest release of `llama.cpp`, and build it from source:
   ```bash
   npm run dev:setup
   ```
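
Taken together, steps 2-4 look roughly like the following. This is only an illustrative sketch; `<your-username>` is a placeholder for your GitHub username, not a real value:
```bash
# Clone your fork (replace <your-username> with your GitHub username)
git clone https://github.com/<your-username>/node-llama-cpp.git
cd node-llama-cpp

# Install dependencies, then build the CLI and llama.cpp from source
npm install
npm run dev:setup
```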

## Development
Whenever you add new functionality to `node-llama-cpp`, consider improving the CLI to reflect this change.

To test whether your local setup works, download a model and try using it with the `chat` command.

### Get a model file
We recommend getting a GGUF model from [TheBloke on Hugging Face](https://huggingface.co/TheBloke?search_models=GGUF).

We recommend starting with a small model that doesn't have a lot of parameters, just to ensure that your setup works, so try downloading a `7B`-parameter model first (search for models with both `7B` and `GGUF` in their name).
For improved download speeds, you can use [`ipull`](https://www.npmjs.com/package/ipull) to download the model:
```bash
npx ipull <model-file-url>
```
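
For illustration, a direct model-file URL on Hugging Face generally follows the `/resolve/main/` pattern shown below; the organization, repository, and file names here are placeholders, not a specific model recommendation:
```bash
# Placeholder URL pattern - substitute a real GGUF repository and file name
npx ipull https://huggingface.co/<organization>/<model-repo>/resolve/main/<model-file>.gguf
```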
### Validate your setup by chatting with a model
To validate that your setup works, run the following command to chat with the model you downloaded:
```bash
npm run dev:build; node ./dist/cli/cli.js chat --model <path-to-model-file-on-your-computer>
```
Try telling the model `Hi there` and see how it reacts. Any response from the model means that your setup works.
If the response looks weird or doesn't make sense, try using a different model.

If the model doesn't stop generating output, try using a different chat wrapper. For example:
```bash
npm run dev:build; node ./dist/cli/cli.js chat --wrapper llamaChat --model <path-to-model-file-on-your-computer>
```
> **Important:** Always run `npm run dev:build` before running the CLI, so that your code changes are reflected in the CLI.
### Debugging
To run a chat session with a debugger, configure your IDE to launch the following command:
```bash
node --loader ts-node/esm ./src/cli/cli.ts chat --model <path-to-model-file-on-your-computer>
```
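
If your IDE attaches to the Node.js inspector instead of launching the process itself, one possible alternative (a sketch using Node's built-in `--inspect-brk` flag, which pauses execution until a debugger attaches) is:
```bash
# Start the CLI paused, waiting for a debugger to attach on the default inspector port (9229)
node --inspect-brk --loader ts-node/esm ./src/cli/cli.ts chat --model <path-to-model-file-on-your-computer>
```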
## Opening a pull request
To open a pull request, read the [CONTRIBUTING.md](https://github.com/withcatai/node-llama-cpp/blob/master/CONTRIBUTING.md) guidelines.