llama : add missing LLAMA_API for llama_chat_builtin_templates (#10636) #40
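This run builds the commit named in the title, which restores symbol export for `llama_chat_builtin_templates` by adding the `LLAMA_API` macro to its declaration in `llama.h`. Below is a minimal sketch of that kind of change; the parameter list is an assumption taken for illustration, so check the upstream header for the authoritative declaration.

```c
#include <stdint.h>   // int32_t
#include <stddef.h>   // size_t

// LLAMA_API is normally defined by llama.h (e.g. __declspec(dllexport)/
// __declspec(dllimport) on Windows shared builds, default visibility with
// GCC/Clang). A no-op fallback is given here only so the sketch stands alone.
#ifndef LLAMA_API
#    define LLAMA_API
#endif

// Before the fix the declaration carried no export macro, so the symbol was
// not exported from shared builds of libllama:
//
//     int32_t llama_chat_builtin_templates(const char ** output, size_t len);

// After the fix it is declared like the other public functions in llama.h.
// The signature shown here is illustrative, not authoritative.
LLAMA_API int32_t llama_chat_builtin_templates(const char ** output, size_t len);
```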
build.yml
on: push
Job | Duration
---|---
Matrix: windows-2019-cmake-cuda |
Matrix: windows-latest-cmake-hip-release |
Matrix: windows-latest-cmake |
macOS-latest-cmake-arm64 | 13m 1s
macOS-latest-cmake-x64 | 4m 22s
ubuntu-latest-cmake | 2m 36s
macOS-latest-cmake | 12m 2s
ubuntu-latest-cmake-rpc | 2m 19s
ubuntu-22-cmake-vulkan | 2m 44s
ubuntu-22-cmake-hip | 19m 15s
ubuntu-22-cmake-musa | 11m 51s
ubuntu-22-cmake-sycl | 4m 45s
ubuntu-22-cmake-sycl-fp16 | 5m 6s
macOS-latest-cmake-ios | 1m 4s
macOS-latest-cmake-tvos | 1m 37s
ubuntu-latest-cmake-cuda | 11m 6s
windows-latest-cmake-sycl | 9m 59s
windows-latest-cmake-hip | 23m 33s
ios-xcode-build | 1m 13s
android-build | 6m 3s
Matrix: ubuntu-latest-cmake-sanitizer |
Matrix: windows-msys2 |
release | 1m 51s
Annotations
1 error and 11 warnings
Artifacts
Produced during runtime
Name | Size
---|---
cudart-llama-bin-win-cu11.7-x64.zip | 303 MB
cudart-llama-bin-win-cu12.4-x64.zip | 372 MB
llama-bin-macos-arm64.zip | 52.1 MB
llama-bin-macos-x64.zip | 53.6 MB
llama-bin-ubuntu-x64.zip | 58.8 MB
llama-bin-win-avx-x64.zip | 8.53 MB
llama-bin-win-avx2-x64.zip | 8.53 MB
llama-bin-win-avx512-x64.zip | 8.54 MB
llama-bin-win-cu11.7-x64.zip | 145 MB
llama-bin-win-cu12.4-x64.zip | 145 MB
llama-bin-win-hip-x64-gfx1030.zip | 228 MB
llama-bin-win-hip-x64-gfx1100.zip | 230 MB
llama-bin-win-hip-x64-gfx1101.zip | 230 MB
llama-bin-win-kompute-x64.zip | 8.83 MB
llama-bin-win-llvm-arm64.zip | 10.1 MB
llama-bin-win-msvc-arm64.zip | 12.8 MB
llama-bin-win-noavx-x64.zip | 8.51 MB
llama-bin-win-openblas-x64.zip | 19.5 MB
llama-bin-win-sycl-x64.zip | 89.2 MB
llama-bin-win-vulkan-x64.zip | 9.29 MB