AI Chat: Handle page content changing during conversation #34706
Comments
@mrose17 is watching...
Small context changes probably won't cause much of a problem. For larger context changes, possible corrections, or the context switching entirely (would this happen in a web app, for example?), I agree that resubmitting the entire conversation on top of a significantly changed base context could have unexpected consequences. Two considerations around this:
> For Brave Talk, this is a premium feature, so I think we should be using the Anthropic Claude model.
I had the same thought, so I think that's wise. We'd need to introduce support for same-Conversation grouped Entries or linked Conversations, or something like that.
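To make that idea concrete, here is a minimal sketch of what same-Conversation grouped Entries could look like. None of these names exist in brave-core; they are purely illustrative placeholders.

```ts
// Hypothetical data model for grouping Conversation Entries by the page-content
// snapshot that was current when each Entry was submitted. Illustrative only.
interface PageContentSnapshot {
  fetchedAt: number;   // ms since epoch when the content was extracted
  content: string;     // extracted page text provided to Leo
}

interface ConversationEntry {
  role: 'human' | 'assistant';
  text: string;
}

interface EntryGroup {
  snapshot: PageContentSnapshot;  // the base context this group was answered against
  entries: ConversationEntry[];
}

interface Conversation {
  id: string;
  groups: EntryGroup[];           // a new group starts whenever the page content changes
}
```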
Could we adjust the prompt to split the conversation up at the point where the context changed and provide the new content? Or maybe that would be too confusing for the model, with the repetition of the content and the increasing size of the prompt.
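If we did split the conversation at the content-change point, the prompt could re-introduce the page content where it changed rather than silently swapping it underneath the whole history. A rough sketch, with an invented prompt layout (not Leo's actual prompt format):

```ts
// Illustrative only: build a prompt that re-states the page content at the
// point where it changed. Types are minimal placeholders.
type Entry = { role: 'human' | 'assistant'; text: string };
type Segment = { pageContent: string; entries: Entry[] };

function buildSplitPrompt(segments: Segment[]): string {
  const parts: string[] = [];
  segments.forEach((segment, i) => {
    const label = i === 0
      ? 'Page content:'
      : 'The page content has since changed to:';
    parts.push(`${label}\n${segment.pageContent}`);
    for (const entry of segment.entries) {
      parts.push(`${entry.role === 'human' ? 'Human' : 'Assistant'}: ${entry.text}`);
    }
  });
  return parts.join('\n\n');
}
```

The trade-off raised above still applies: every content change is repeated verbatim, so the prompt grows with each change.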
Adding
The above requires
Verification
Confirmed the output was:
| example | example |
| --- | --- |
Verified on
Used test plan in #34706 (comment)
Removing the above from #35625 as |
At the moment we only fetch page data at page load (or conversation start, whichever is later). This is what we provide as context to Leo for page-connected Conversations. There are many use cases where page data meaningfully changes after page load. For example:
We can easily fetch the new content. For example, whenever the user submits a new Entry to the Conversation, we can update the page content before building the prompt.
Updating a Conversation's associated page content in the middle of that Conversation would change the initial prompt sent for subsequent Entries, whilst keeping the same Conversation history that is appended to the prompt after the page content.
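As a sketch of that flow, assuming a hypothetical `fetchPageContent()` helper and an invented prompt layout (neither reflects the real implementation):

```ts
// Sketch of the approach described above: re-fetch the page content whenever a
// new Entry is submitted and rebuild the initial prompt around it, while the
// existing Conversation history is appended unchanged.
type HistoryEntry = { role: 'human' | 'assistant'; text: string };

async function buildPromptForNextEntry(
  history: HistoryEntry[],
  newUserText: string,
  fetchPageContent: () => Promise<string>
): Promise<string> {
  const pageContent = await fetchPageContent();  // current content, not load-time content
  const parts = [`Page content:\n${pageContent}`];
  for (const entry of history) {
    parts.push(`${entry.role === 'human' ? 'Human' : 'Assistant'}: ${entry.text}`);
  }
  parts.push(`Human: ${newUserText}`);
  return parts.join('\n\n');
}
```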
The potential complication is whether this affects Leo's responses negatively. For example, consider a Conversation where the user is asking Leo questions that Leo doesn't know the answer to because the information wasn't yet in the page content. When we send a subsequent message, we might send a prompt with new page content that does have the relevant information, and Leo might get confused when it "sees" that it has already provided responses saying it doesn't know that information.
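Concretely, the confusing prompt would look something like this (an invented transcript, just to illustrate the contradiction):

```ts
// Invented transcript: the refreshed page content now contains the answer,
// but the history still says Leo couldn't find it.
const confusingPrompt = [
  'Page content:\nThe answer is Red',                      // refreshed content
  'Human: What is the answer?',
  "Assistant: The page doesn't say what the answer is.",   // written against the old content
  'Human: Are you sure? What is the answer?',
].join('\n\n');
```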
Test Plan
document.body.innerHTML = 'The answer is Red'
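Expanding the single line above into a sequence of steps; everything other than the `innerHTML` line is inferred from the scenario in the description, not taken from the issue.

```ts
// Illustrative walk-through of the test plan.
// 1. Navigate to a simple page that does not mention "Red".
// 2. Open Leo with page context enabled and ask: "What is the answer?"
//    Expected: Leo says the page doesn't contain that information.
// 3. In DevTools, change the page content:
document.body.innerHTML = 'The answer is Red'
// 4. Ask the same question again in the same Conversation.
//    Expected with this change: Leo answers "Red", because the content is
//    re-fetched before building the prompt for the new Entry.
```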