Is there a high-level chat option? #424

Answered by gjmulder
AnOpenSauceDev asked this question in Q&A
Check out oobabooga/text-generation-webui, which can use llama-cpp-python. I've had a lot of success with llama-cpp-telegram_bot using the Vicuna and WizardLM GGML models that @TheBloke has converted.

The key is to understand how the models you are using were trained and to ensure your prompt template conforms to the training template.
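For example, here is a minimal sketch of what that looks like with llama-cpp-python directly, assuming a Vicuna v1.1-style GGML model (the model path is a placeholder, and the template must be changed if your model was trained with a different one):

```python
# Minimal sketch: chat with a Vicuna-style GGML model via llama-cpp-python.
# The prompt string below follows the Vicuna v1.1 training template; other
# models (e.g. WizardLM) expect a different template.
from llama_cpp import Llama

llm = Llama(model_path="./models/vicuna-7b.ggmlv3.q4_0.bin")  # hypothetical path

SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def chat(user_message: str) -> str:
    # Format the prompt exactly the way the model saw it during training.
    prompt = f"{SYSTEM} USER: {user_message} ASSISTANT:"
    out = llm(prompt, max_tokens=256, stop=["USER:"])
    return out["choices"][0]["text"].strip()

print(chat("Is there a high-level chat option?"))
```

Higher-level frontends like text-generation-webui do the same thing under the hood; you just select the matching prompt template in their settings instead of formatting it yourself.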

Replies: 1 comment · 1 reply

Answer selected by AnOpenSauceDev
Category: Q&A · Labels: none yet · 2 participants
This discussion was converted from issue #423 on June 26, 2023 07:42.