
Is there memory / conversation history support yet? #45

Closed
BoweFrankema opened this issue Oct 29, 2023 · 2 comments

Comments

@BoweFrankema

BoweFrankema commented Oct 29, 2023

I've got my Q&A bot working with private data, and it works really well! I was wondering if there are any native methods yet to retrieve the chat history of a conversation. If not, I'll probably add it to the system prompt, but that's not ideal since it would quickly balloon the prompt size.

Would it make sense to use the Memory store, save the conversation messages on the fly, and vectorize them to serve as the memory/history?

I also found the code below; maybe that is your recommended approach?

public function answerQuestionFromChat(array $messages): StreamedResponse
{
    // First, build a system message that gives OpenAI the relevant context and instructions
    $userQuestion = $messages[count($messages) - 1]->content;
    $systemMessage = $this->searchDocumentAndCreateSystemMessage($userQuestion);
    $this->openAIChat->setSystemMessage($systemMessage);

    // Then stream the response for the whole conversation
    return $this->openAIChat->generateChatStream($messages);
}

Thanks in advance for your help :-)

@MaximeThoonsen
Collaborator

Hey @BoweFrankema ,
Did you have a look at the example qa-chatbot-laravel-vercel? The basic idea is that your frontend keeps the chat history and sends it to the backend (see here).
This works well for quite a lot of messages. Beyond that, you may need to summarize the conversation instead of sending all the messages, or just send the most recent ones. There are multiple strategies for this.
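The "just send the last messages" strategy can be sketched in a few lines of plain PHP. This is a hedged illustration, not LLPhant API: the function name, the message shape, and the `$maxMessages` default are assumptions chosen for the example.

```php
<?php

// Illustrative sketch (not part of LLPhant): cap the history sent to the
// chat endpoint by keeping the first (system) message plus the most
// recent ones, dropping the middle of long conversations.
function trimHistory(array $messages, int $maxMessages = 10): array
{
    if (count($messages) <= $maxMessages) {
        return $messages;
    }

    // Always keep the first message, then fill the rest with the newest.
    $first = $messages[0];
    $recent = array_slice($messages, -($maxMessages - 1));

    return array_merge([$first], $recent);
}
```

You would call this on the `$messages` array before passing it to `generateChatStream($messages)`, so the prompt size stays bounded regardless of conversation length.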

Does this answer your question?

@BoweFrankema
Author

I did not see that! This is basically exactly how I wanted to solve it. I'm already storing the conversation in a Laravel model, so I'll follow that example. Maybe I'll count the total characters and, once the conversation hits a certain size, run a separate API request to summarize it. Thank you Maxime!
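The character-count check described above could look something like this. Again a sketch under stated assumptions: the function name, the `['content' => ...]` message shape, and the 8000-character threshold are all illustrative, not taken from LLPhant or the example repo.

```php
<?php

// Illustrative sketch: decide whether the conversation has grown large
// enough to warrant a separate summarization request. Threshold and
// message shape are assumptions for this example.
function needsSummarization(array $messages, int $maxChars = 8000): bool
{
    $total = 0;
    foreach ($messages as $message) {
        // strlen counts bytes; swap in mb_strlen if you need
        // character-accurate counts for multibyte text.
        $total += strlen($message['content']);
    }

    return $total > $maxChars;
}
```

When this returns `true`, the older messages could be replaced by a single summary message produced by a separate API call, keeping the prompt small while preserving context.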
