Demo: Onboard.bot
OnboardBot allows you to direct chatbot conversations towards collecting data defined by flexible YAML files.
For example, the following YAML will direct the chatbot to collect a name, an email address, and the user's desired property type:
```yaml
models:
- name: Buyer
  class_type: Question
  fields:
  - name: name
  - name: email_address
- name: PropertyStyle
  class_type: Choice
  description: What is your desired property type?
  fields:
  - name: townhouse
  - name: condo
  - name: single_family
  - name: multi_family
  - name: land
  - name: other
```
- Currently, your models config MUST start with a `Question` model type. Hoping to fix this soon.
- `git clone git@github.com:kvnn/OnboardBot.git`
- `cd OnboardBot/src/server`
- (optional) `python3 -m venv .venv && source .venv/bin/activate`
- `pip install -r requirements.txt`
- Create an `.env` file in the server directory. It will require an `OPENROUTER_API_KEY` OR an `OPENAI_API_KEY`.
```
CHAINLIT_AUTH_SECRET="YOUR SECRET KEY"
OPENROUTER_API_KEY="YOUR OPENROUTER_API_KEY"

# Optional
DB_ENGINE_URL="postgresql://postgres:{password}@{host}:5432/postgres"
```
`DB_ENGINE_URL` is optional, but without it the data retrieved from the chats will not be readily available to you.
It's also trivial to send the data to a Slack channel, for example; built-in support for this is coming soon. Here is an example of an output:
```yaml
buyer:
  email_address: [email protected]
  name: Kevin
desiredproperty:
  number_of_bathrooms: 2.5
  number_of_bedrooms: 3
multifamilydetails:
  business headquarters: false
  grandparents quarters: true
  off-grid compound: false
  vacation rental: false
musthave:
  must_haves: green gardens, good aesthetics
propertystyle:
  condo: false
  land: false
  multi_family: true
  other: false
  single_family: false
  townhouse: false
```
- You can modify `config.yml` to point to a new model definition that fits your use case. For now, it is best to copy `bots/realty.yml` and experiment with modifications.
- There are three model types:
  - `Question`: an open-ended question. The user's text answer is saved to the model instance value.
  - `Choice`: a single choice. The user's choice is saved to the model instance value.
  - `MultiChoice`: coming soon. It is working in my fork of Chainlit, and the PR should be merged soon. See Chainlit/chainlit#965.
- There is `conditions` logic. See `MultiFamilyDetails` in `bots/realty.yml`. `for_choice` is a pointer to another model: a model with this config will show only if that model has a value matching the `for_value` key of the config. So you can show or hide questions based on the values of previous questions (see the sketch after this list).
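For illustration, a conditioned model might look roughly like the sketch below. Only `for_choice` and `for_value` are named above; the exact nesting of the `conditions` block and the description wording are assumptions, so treat `bots/realty.yml` as the authoritative example.

```yaml
# Sketch only: the conditions nesting is assumed; see bots/realty.yml for the real shape.
- name: MultiFamilyDetails
  class_type: Choice
  description: What will the multi-family property be used for?
  conditions:
    for_choice: PropertyStyle   # points at another model
    for_value: multi_family     # shown only when PropertyStyle matched this value
  fields:
  - name: grandparents quarters
  - name: vacation rental
  - name: business headquarters
  - name: off-grid compound
```

The field names mirror the `multifamilydetails` keys in the sample output above.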
The OnboardBot server is a simple, opinionated, and flexible Chainlit project, so you run it with `chainlit run app.py` from `OnboardBot/src/server`.
This will open a browser tab running the chatbot interface.
OnboardBot (via `prompts.py`) will use the data model given in `config.yml` to collect data from the user in a conversational, helpful manner.
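The contents of `config.yml` are not reproduced here; as a rough sketch (the key name below is a guess, not the repository's actual schema), it points at a bot definition file such as `bots/realty.yml`:

```yaml
# Hypothetical key name; check the real config.yml in the repository.
bot_definition: bots/realty.yml
```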
Notice that conditional logic, in the case of `notification_preference_sms_email_whatsapp_or_combination`, is handled in the field name itself. The LLMs (including `Mixtral-8x7B-Instruct`, which is 100x cheaper than GPT-4) handle this exactly how we'd wish. The aim is to push the simplicity as far as possible before implementing logic chains in the models.
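As an illustration, such a field could be declared like this in the bot YAML; the model name and sibling field below are made up, and only the long field name comes from the project:

```yaml
# Illustrative model; only the notification_preference field name is from the project.
- name: ContactPreferences
  class_type: Question
  fields:
  - name: name
  - name: notification_preference_sms_email_whatsapp_or_combination
```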
Single choice, multiple choice, and conditional logic are supported.
OnboardBot does not use a custom client; it uses the default Chainlit UI.
Everything OnboardBot wishes to achieve is done via chat.
Create an issue or email [email protected].
- Explain and give good examples of `Choice`, `MultiChoice`, and conditionals
- Support OpenAI (you'll need to modify `llm.py` to do this for now: just remove the `base_url` property from `llm_chat_client_async` and change the default models to an OpenAI model)
- Support duplicate key names in data classes (the LLM needs to understand they are unique, even with the same key)
- Decide: do `User` and `Goal` need to be first-class models, and should the prompt differ for definitions of data models versus classes that inherit from them? There is a tricky truth here: we are defining Pydantic models for the LLM's understanding, when we are conditioned to write data models to make our code more meaningful. So we may need utility classes (or first-class classes?) that support `enabled_models` intended for the LLM's understanding.