A chat UI for conversing with a local, offline Llama3 model.
- Install Ollama - https://ollama.com/
- Install Python 3
In your terminal, pull the model:
ollama pull llama3:latest
Then install the dependencies and start the app:
pip3 install -r requirements.txt
python3 gradio_app_v1.py
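The UI itself lives in gradio_app_v1.py. As a rough illustration of what happens under the hood, here is a minimal sketch that talks to the local Ollama server's /api/chat REST endpoint directly (the endpoint and payload shape are Ollama's documented API; the function names and the (user, assistant) history format are assumptions of this sketch, not taken from the app):

```python
import json
import urllib.request

# Ollama listens on this port by default after installation.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(prompt, history=None, model="llama3:latest"):
    """Assemble the JSON body Ollama's /api/chat endpoint expects.

    `history` is assumed to be a list of (user, assistant) message
    pairs, the shape Gradio chat callbacks commonly receive.
    """
    messages = []
    for user_msg, assistant_msg in (history or []):
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": prompt})
    # stream=False asks Ollama for a single JSON response instead of chunks.
    return {"model": model, "messages": messages, "stream": False}

def chat(prompt, history=None):
    """Send one chat turn to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_request(prompt, history)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    # Requires `ollama pull llama3:latest` and a running Ollama server.
    print(chat("Say hello in one sentence."))
```

The network call only runs when Ollama is up; the payload builder is pure and can be reused from any UI callback.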
Add the following alias to your ~/.bashrc or ~/.zshrc file:
alias llama='cd ~/llama3_local && python3 gradio_app_v1.py'
NOTE: replace ~/llama3_local with the path where you've saved this project.
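The alias step above can be done from the terminal like so (a sketch assuming zsh; use ~/.bashrc for bash, and substitute your own project path for ~/llama3_local):

```shell
# Pick your shell's rc file (~/.zshrc for zsh, ~/.bashrc for bash).
RC="$HOME/.zshrc"

# Append the alias; `&&` ensures the app only starts if the cd succeeds.
echo "alias llama='cd ~/llama3_local && python3 gradio_app_v1.py'" >> "$RC"

# Reload the rc file so the alias is available in the current session.
source "$RC"
```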
Then, from any new terminal session, launch the app with:
llama