Local Llama 3 UI

A chat UI for chatting with a local, offline Llama 3 model.

Architecture

[Architecture diagram]
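
The app wires these pieces together roughly as sketched below. This is an illustrative sketch only, not the actual contents of gradio_app_v1.py; the model tag, session id, and SQLite file name are assumptions.

import gradio as gr
from langchain_community.chat_models import ChatOllama
from langchain_community.chat_message_histories import SQLChatMessageHistory

# Chat model served by the local Ollama instance (model tag assumed).
llm = ChatOllama(model="llama3:latest")

# Persist the conversation in a local SQLite file (file name assumed).
history = SQLChatMessageHistory(
    session_id="default",
    connection_string="sqlite:///chat_history.db",
)

def respond(message, chat_history):
    # Store the user's turn, ask the model with the full stored history,
    # then store and return the model's answer.
    history.add_user_message(message)
    answer = llm.invoke(history.messages)
    history.add_ai_message(answer.content)
    return answer.content

gr.ChatInterface(respond).launch()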

Gradio UI

[Gradio UI screenshot]

Prerequisites

Download the Llama 3 model

In your terminal:

ollama pull llama3:latest

Install Python libraries

In your terminal:

pip3 install -r requirements.txt

Run the Gradio app

python3 gradio_app_v1.py
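
Gradio prints a local URL in the terminal (by default http://127.0.0.1:7860); open it in your browser to start chatting.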

Create a shell alias

Add to your .bashrc or .zshrc file:

alias llama='cd ~/llama3_local && python3 gradio_app_v1.py'

NOTE: replace ~/llama3_local with the path where you've saved this project.

Run the alias from any directory to start the app:

llama

About

Local chatbot with a Gradio frontend and an Ollama Llama 3 backend, integrated via LangChain, with chat history stored in SQLite.
