forked from ggerganov/llama.cpp
Showing 52 changed files with 3,388 additions and 1,554 deletions.
# CI

In addition to [GitHub Actions](https://github.com/ggerganov/llama.cpp/actions), `llama.cpp` uses a custom CI framework:

https://github.com/ggml-org/ci

It monitors the `master` branch for new commits and runs the
[ci/run.sh](https://github.com/ggerganov/llama.cpp/blob/master/ci/run.sh) script on dedicated cloud instances. This allows us
to execute heavier workloads than would be practical with GitHub Actions alone. Over time, the cloud instances will be scaled
to cover various hardware architectures, including GPU and Apple Silicon instances.

Collaborators can optionally trigger a CI run by adding the `ggml-ci` keyword to their commit message.
Only branches of this repository are monitored for this keyword.
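
For example, a collaborator could trigger the custom CI with a commit like the one below (the commit message text is purely illustrative, not taken from the repository):

```bash
# any commit whose message contains the "ggml-ci" keyword will be picked up by
# the custom CI framework (the message text here is only an example)
git commit -m "llama : fix tokenizer edge case (ggml-ci)"
```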

It is good practice to execute the full CI locally on your machine before publishing changes:

```bash
mkdir tmp

# CPU-only build
bash ./ci/run.sh ./tmp/results ./tmp/mnt

# with CUDA support
GG_BUILD_CUDA=1 bash ./ci/run.sh ./tmp/results ./tmp/mnt
```
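
The two positional arguments appear to be an output directory for the run's results and a working directory for downloads and temporary data (an assumption based on the invocation above, not something the document states). After a local run, the output directory can be inspected directly:

```bash
# list the logs and artifacts produced by the local CI run
# (the exact directory layout is an assumption; adjust to whatever your run produces)
ls -R ./tmp/results
```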