Pinned
- MicroLlama (Public)
  MicroLlama is a small Llama-based model with 300M parameters, trained from scratch on a $500 budget (see the loading sketch after this list).
- TinyLlama (Public, forked from jzhang38/TinyLlama)
  The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
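
Since both models follow the Llama architecture, a checkpoint like MicroLlama can in principle be loaded with the Hugging Face transformers library. The sketch below is illustrative only: the Hub ID keeeeenw/MicroLlama and the prompt are assumptions, not confirmed by this page.

    # Minimal sketch: load a small Llama-architecture checkpoint and
    # generate a short continuation with Hugging Face transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "keeeeenw/MicroLlama"  # assumed Hub ID, adjust as needed

    # Download tokenizer and weights from the Hub
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Tokenize a prompt and sample a short completion
    inputs = tokenizer("The capital of France is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))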