Actions: cpumaxx/llama.cpp

Python check requirements.txt

2 workflow runs


Python check requirements.txt #2
readme : add note that LLaMA 3 is not supported with convert.py (#7065)
Commit ca36326 pushed by cpumaxx to master
May 5, 2024 06:41, duration 5m 47s
Python check requirements.txt #1
Fix more int overflow during quant (PPL/CUDA). (#6563)
Commit e00b4a8 pushed by cpumaxx to master
April 28, 2024 22:47, duration 5m 36s
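
For context, a minimal sketch of the kind of validation a "check requirements.txt" workflow might run is shown below. This is illustrative only, not the actual logic of the llama.cpp workflow: the file path, the skipped line prefixes, and the exit-code convention are assumptions; it simply verifies that each non-comment line parses as a valid requirement specifier.

```python
# Hypothetical requirements.txt validator (illustrative, not llama.cpp's CI logic).
# Requires the "packaging" package for PEP 508 requirement parsing.
import sys
from pathlib import Path

from packaging.requirements import InvalidRequirement, Requirement


def check_requirements(path: Path) -> int:
    """Return the number of lines that fail to parse as requirement specifiers."""
    errors = 0
    for lineno, raw in enumerate(path.read_text().splitlines(), start=1):
        line = raw.strip()
        # Skip blank lines, comments, and nested requirement-file includes.
        if not line or line.startswith("#") or line.startswith("-r"):
            continue
        try:
            Requirement(line)
        except InvalidRequirement as exc:
            print(f"{path}:{lineno}: invalid requirement {line!r}: {exc}")
            errors += 1
    return errors


if __name__ == "__main__":
    # Assumed path; a real workflow might check several requirements-*.txt files.
    sys.exit(1 if check_requirements(Path("requirements.txt")) else 0)
```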