LLM-based chatbot that queries and visualizes KGX nodes and edges TSV files loaded into either a DuckDB (default) or Neo4j database backend.
| LLM Provider | Models |
|---|---|
| OpenAI | gpt-4o-2024-08-06, gpt-4o-mini, gpt-4o-mini-2024-07-18, gpt-4o-2024-05-13, gpt-4o, gpt-4-turbo-2024-04-09, gpt-4-turbo, gpt-4-turbo-preview |
| Anthropic | claude-3-5-sonnet-20240620, claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-haiku-20240307 |
| Ollama | llama3.1 |
| LBNL-hosted models via CBORG | lbl/cborg-chat:latest, lbl/cborg-chat-nano:latest, lbl/cborg-coder:latest, openai/chatgpt:latest, anthropic/claude:latest, google/gemini:latest |
- **OpenAI**: Ensure `OPENAI_API_KEY` is set as an environment variable.
- **Anthropic**: Ensure `ANTHROPIC_API_KEY` is set as an environment variable.
- **Ollama**: No API key required. Better results are obtained with the `llama3.1:405b` model, which needs a GPU.
  - Download the application from here and install it locally.
  - Get any model of your choice, but make sure the model has the `Tools` badge for it to work. Here's an example: `ollama run llama3.1:405b`
- **Models hosted by Lawrence Berkeley National Laboratory via CBORG**: Ensure `CBORG_API_KEY` is set as an environment variable.
  - The list of models can be found here, listed under "LBNL-Hosted Models".
One quick way is:

```shell
export OPENAI_API_KEY=XXXXXX
export ANTHROPIC_API_KEY=XXXXX
export CBORG_API_KEY=XXXX
```

If you want these to persist across sessions, open your shell profile:

```shell
vi ~/.bash_profile
```

OR

```shell
vi ~/.bashrc
```

Add the `export` lines above, then reload the file:

```shell
source ~/.bash_profile
```

OR

```shell
source ~/.bashrc
```
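To confirm which provider keys your current shell session actually exposes before starting the app, a small sketch (adjust the variable list to the providers you use):

```shell
# Report which provider API keys are visible in the current environment.
for v in OPENAI_API_KEY ANTHROPIC_API_KEY CBORG_API_KEY; do
  if printenv "$v" >/dev/null; then
    echo "$v is set"
  else
    echo "$v is NOT set"
  fi
done
```

Only the key for the provider you select needs to be set.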
- Install Neo4j Desktop from here.
- Create a new project and database, then start it.
- Install the APOC plugin in Neo4j Desktop.
- Update the database settings to match `neo4j_db_settings.conf`.
- Clone this repository.
- Create a virtual environment and install dependencies:

  ```shell
  cd kg-chat
  pip install poetry
  poetry install
  ```

- Replace `data/nodes.tsv` and `data/edges.tsv` with desired KGX files if needed.
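If you just want to try the tool without real data, a minimal pair of KGX-style TSV files can be sketched as follows. The identifiers and Biolink categories below are illustrative placeholders, not files shipped with this repository; real KGX files typically carry additional columns.

```shell
mkdir -p data

# nodes.tsv: one row per node (id, category, name is a common KGX column subset)
printf 'id\tcategory\tname\n' > data/nodes.tsv
printf 'HGNC:1100\tbiolink:Gene\tBRCA1\n' >> data/nodes.tsv
printf 'MONDO:0007254\tbiolink:Disease\tbreast cancer\n' >> data/nodes.tsv

# edges.tsv: one row per edge (subject, predicate, object)
printf 'subject\tpredicate\tobject\n' > data/edges.tsv
printf 'HGNC:1100\tbiolink:related_to\tMONDO:0007254\n' >> data/edges.tsv
```

The filenames must be exactly `nodes.tsv` and `edges.tsv`, since the commands below look for those names inside `--data-dir`.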
```shell
pip install kg-chat
```

OR

```shell
poetry add kg-chat@latest
```
- DuckDB [default]
- Neo4j
- **Import KG**: Load nodes and edges into a database (default: DuckDB).

  ```shell
  poetry run kg import --data-dir data
  ```

- **List LLM models**: List the supported LLM models.

  ```shell
  poetry run kg list-models
  ```

- **Test Query**: Run a test query.

  > ⚠️ `--data-dir` is a required parameter for all commands. It is the path to the directory containing the `nodes.tsv` and `edges.tsv` files; the filenames must be exactly those.

  ```shell
  poetry run kg test-query --data-dir data
  ```
- **QnA**: Ask questions about the data.

  ```shell
  poetry run kg qna "how many nodes do we have here?" --data-dir data
  ```

- **Chat**: Start an interactive chat session.

  ```shell
  poetry run kg chat --data-dir data
  ```

- **App**: Deploy a local web application.

  ```shell
  poetry run kg app --data-dir data
  ```

  Use `show me` in prompts for KG visualization.
This project was generated from the monarch-project-template cookiecutter template and will be kept up-to-date using cruft.