
AI Chat: Handle page content changing during conversation #34706

Closed
petemill opened this issue Dec 5, 2023 · 9 comments · Fixed by brave/brave-core#21408
Comments


petemill commented Dec 5, 2023

At the moment we only fetch page data at page load (or at conversation start, whichever is later). This is what we provide as context to Leo for page-connected Conversations. There are many use cases where page data meaningfully changes after load time. For example:

  • JS-heavy pages which load their content later
  • Pages which load more content as the user scrolls
  • Pages where there is no content until it is generated
    • Meetings where text transcript is built as the meeting progresses
    • Documents where content is written after page load

We can easily fetch the new content. For example whenever a new Conversation Entry is submitted by the user to the Conversation, we can update the page content before building the prompt.

Updating the associated page content for a Conversation in the middle of a Conversation would change the initial prompt sent for subsequent Entries, whilst keeping the same Conversation history that is appended to the prompt after the page content.
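The refresh step described above can be sketched roughly as follows. This is an illustrative sketch only, not Brave's actual implementation; all names here (`buildPrompt`, `submitEntry`, `getCurrentPageContent`) are hypothetical.

```javascript
// Illustrative sketch only -- not Brave's actual code. All names here
// (buildPrompt, submitEntry, getCurrentPageContent) are hypothetical.

function buildPrompt(pageContent, history, newEntry) {
  // Page content leads the prompt; the unchanged Conversation history
  // is appended after it, followed by the new user Entry.
  return [
    `Page content:\n${pageContent}`,
    ...history.map(e => `${e.role}: ${e.text}`),
    `user: ${newEntry}`,
  ].join('\n');
}

function submitEntry(conversation, getCurrentPageContent, newEntry) {
  // Re-fetch the page content at submission time instead of reusing the
  // snapshot taken at page load / conversation start.
  conversation.pageContent = getCurrentPageContent();
  return buildPrompt(conversation.pageContent, conversation.history, newEntry);
}
```

The key point is that only the page-content portion of the prompt is rebuilt per Entry; the Conversation history appended after it stays the same.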

The potential complication is if this affects Leo responses negatively. For example in a Conversation where the user is asking Leo questions which Leo doesn't know the answer to because the information wasn't yet in the page content. When we send a subsequent message, we might send a prompt with new page content that does have the relevant information. Leo might get confused when it "sees" that it's already provided responses about not knowing that information.

Test Plan

  1. Open any web page with Leo-able content
  2. Start a Leo conversation, associated with page content
  3. Open devtools and run `document.body.innerHTML = 'The answer is Red'` in the JS console
  4. In the same conversation, ask Leo "what is the answer, according to the page"
@petemill petemill added OS/Android Fixes related to Android browser functionality OS/Desktop labels Dec 5, 2023

mrose17 commented Dec 5, 2023

@mrose17 is watching...

@LorenzoMinto (Member) commented:

Small context changes probably won't cause much of a problem. For larger context changes (possible corrections, or the context switching entirely; would this happen in a web app, for example?), resubmitting the entire conversation on top of a significantly changed base context could, I agree, have unexpected consequences. Two considerations:

  • when the base context changes more than X%, we could notify the user that the page content has changed significantly and that the conversation history will be reset. That doesn't necessarily mean erasing those interactions from the chat interface; we could simply stop adding them to the conversation history when submitting new queries. That way the past conversation won't affect the new replies.
  • with more capable models (i.e. Claude), we could try a prompt that tells the model the base context has changed and that previous exchanges may be outdated, hoping they will affect newer responses less. That said, the first option seems the cleanest.
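A minimal sketch of the "changes more than X%" check, assuming a crude token-overlap ratio stands in for a real diff. The function names and the 50% default threshold are illustrative, not anything Brave actually uses.

```javascript
// Illustrative sketch only. A real implementation would likely use a proper
// diff; this crude token-overlap ratio is purely for demonstration.
function contextChangeRatio(oldContent, newContent) {
  const oldTokens = new Set(oldContent.split(/\s+/));
  const newTokens = newContent.split(/\s+/).filter(Boolean);
  if (newTokens.length === 0) return 1;
  const kept = newTokens.filter(t => oldTokens.has(t)).length;
  return 1 - kept / newTokens.length; // 0 = unchanged, 1 = fully changed
}

// If the change exceeds the threshold, stop appending prior Entries to the
// prompt (without erasing them from the chat UI).
function shouldResetHistory(oldContent, newContent, threshold = 0.5) {
  return contextChangeRatio(oldContent, newContent) > threshold;
}
```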


mrose17 commented Dec 5, 2023

For Brave Talk, this is a premium feature, so I think we should be using the Anthropic Claude model.


petemill commented Dec 5, 2023

> when the base context changes more than X%, we could notify the user that the page content has changed significantly and that the conversation history will be reset, which doesn't necessarily mean that we would erase those interactions from the chat interface but we could simply not add them to the conversation history when submitting new queries. That way the past conversation won't affect the new replies.

I had the same thought so I think that's wise. We'd need to introduce support for same-Conversation grouped Entries or linked Conversations or something like that.

> with more capable models (ie. Claude), we could try using a prompt that tells the model that the base context has changed and that previous conversations may be outdated, hoping that they will affect newer responses less. Although the first option seems the cleanest.

Could we adjust the prompt to split the conversation at the point where the context changed and provide the new content there? Or maybe that would be too confusing for the model, given the repetition of the content and the growing size of the prompt.

@kjozwiak (Member) commented:

Adding QA/Blocked as the above will still need to be uplifted into 1.62.x once C121 is merged into 1.62.x.

@kjozwiak kjozwiak moved this from Uplift Pending to Done in Browser AI Jan 22, 2024
@kjozwiak (Member) commented:

The above requires 1.62.149 or higher for 1.62.x verification 👍

@MadhaviSeelam commented:

Verification PASSED using

Brave | 1.62.149 Chromium: 121.0.6167.75 (Official Build) (64-bit)
Revision | ff84587bd70af9fcbcbe59fc5194ca65082759c4
OS | Windows 11 Version 22H2 (Build 22621.3007)
  1. installed 1.62.149
  2. launched Brave (using `--enable-logging=stderr --v=2 --env-leo=staging --env-ai-chat.bsg=dev --env-ai-chat-premium.bsg=dev`)
  3. loaded https://techcrunch.com/
  4. clicked sidebar in the toolbar
  5. clicked on Leo in the sidebar
  6. clicked on Summarize this page
  7. clicked on Accept and begin
  8. opened Developer Tools
  9. clicked on Console
  10. input `document.body.innerHTML = 'The answer is Red'` and pressed enter
  11. confirmed the page turned to white with the above text
  12. in the Leo sidebar, entered according to the page, what is the answer?

Confirmed the output was: "The answer, according to the page, is red"


@Uni-verse Uni-verse added the QA/In-Progress Indicates that QA is currently in progress for that particular issue label Jan 24, 2024
@Uni-verse (Contributor) commented:

Verified on Samsung Galaxy S21 using the following version:

Brave | 1.62.152 Chromium: 121.0.6167.101 (Official Build) (64-bit)
Revision | 43e7bbb904a2fa762d73f3ac732effecba4c5362
OS | Android 13; Build/TP1A.220624.014; 33; REL

Used test plan in #34706 (comment)

  • Ensured Leo can handle changing page content during conversation.
  • Ensured "The answer according to the page, is red" is the output after injecting the HTML from step 10 in the test plan above.

@Uni-verse Uni-verse added QA Pass - Android ARM and removed QA/In-Progress Indicates that QA is currently in progress for that particular issue labels Jan 24, 2024
@kjozwiak (Member) commented:

Removing the above from #35625 as Leo is targeting 1.63.x for Android.

Projects
Status: Done

7 participants