Actions: Mozilla-Ocho/llamafile

Showing runs from all workflows: 549 workflow runs

Fix additional review issues with GPU config
CI #86: Commit 5dff322 pushed by jart
January 8, 2024 02:19 · 6m 1s · main

Fix indentation in log.c
CI #85: Commit 3828eb8 pushed by jart
January 8, 2024 01:45 · 4m 39s · main

Get AMD GPU support working on Linux
CI #84: Commit 6559da6 pushed by jart
January 8, 2024 01:08 · 6m 44s · main

Clean up more small issues
CI #83: Commit 7be970a pushed by jart
January 7, 2024 04:26 · 5m 30s · main

Fix download-cosmocc.sh on Mac (#177)
CI #81: Commit 8d5e91f pushed by jart
January 6, 2024 23:55 · 4m 19s · main

Clean up the GPU support code
CI #79: Commit ef10913 pushed by jart
January 6, 2024 23:17 · 7m 58s · main

Restore MSVC compatibility
CI #78: Commit 0d10a33 pushed by jart
January 6, 2024 20:49 · 5m 49s · main

Fix support for multiple GPUs
CI #77: Commit 4616816 pushed by jart
January 6, 2024 14:33 · 5m 30s · main

Make CLIP GPU acceleration work on UNIX / Windows
CI #76: Commit 20d5f46 pushed by jart
January 6, 2024 05:40 · 6m 1s · main

Upgrade to cosmocc v3.2.1
CI #75: Commit 2b1fef9 pushed by jart
January 5, 2024 23:56 · 6m 57s · main

Remove -main from llamafile names
CI #74: Commit a6d041a pushed by jart
January 5, 2024 18:52 · 5m 28s · main

Release llamafile v0.5
CI #73: Commit ef83e2b pushed by jart
January 5, 2024 17:50 · 5m 0s · main

Make default thread count capped at 12 maximum
CI #72: Commit 7843183 pushed by jart
January 5, 2024 12:27 · 4m 34s · main

Don't use ggml-backend on non-Apple platforms
CI #71: Commit 922c4f1 pushed by jart
January 5, 2024 12:21 · 4m 7s · main

Make --log-disable work better
CI #70: Commit 7d23bc9 pushed by jart
January 5, 2024 11:13 · 5m 55s · main

Restore Apple Metal GPU support
CI #69: Commit 912e616 pushed by jart
January 5, 2024 10:12 · 6m 5s · main

Restore Apple Metal GPU support
CI #68: Commit d8ad2ae pushed by jart
January 5, 2024 10:00 · 3m 59s · main

Sync with llama.cpp upstream
CI #67: Commit 2e276a1 pushed by jart
January 5, 2024 09:15 · 4m 54s · main

Make JSON server crashes more informative
CI #66: Commit dd4c9d7 pushed by jart
January 5, 2024 03:56 · 4m 32s · main

Mention that lower perplexity is better
CI #65: Commit 82f28fe pushed by jart
January 5, 2024 02:57 · 6m 24s · main

Add example using llamafile-perplexity to manual
CI #64: Commit 9e60f5c pushed by jart
January 5, 2024 02:53 · 5m 24s · main

Embed man page into --help flag of each program
CI #63: Commit 156f0a6 pushed by jart
January 5, 2024 02:43 · 8m 45s · main

Upgrade to cosmocc 3.2
CI #62: Commit ce4aac6 pushed by jart
January 4, 2024 22:07 · 8m 22s · main

Avoid activating GPU when we can't use the GPU
CI #61: Commit 01b9aaf pushed by jart
January 4, 2024 16:14 · 8m 39s · main

Make --nocompile flag work as intended
CI #60: Commit 474b44f pushed by jart
January 4, 2024 14:43 · 5m 5s · main