[Search][Playground] Improve follow up question flow #189848

Merged

Conversation

@saarikabhasi (Member) commented Aug 3, 2024

Summary

Adds the query used for searching. This improves visibility for the user into the query being used for follow-up questions.

[Screenshot 2024-08-03 at 3:24 PM]


@saarikabhasi added labels release_note:skip (Skip the PR/issue when compiling release notes), backport:skip (This commit does not require backporting), v8.15.0, v8.16.0 on Aug 3, 2024
@saarikabhasi requested a review from a team as a code owner on August 3, 2024 19:25
@@ -236,6 +236,10 @@ class ConversationalChainFn {
        type: 'prompt_token_count',
        count: getTokenEstimateFromMessages(msg),
      });
+     data.appendMessageAnnotation({
+       type: 'search_query',
+       question,
+     });
@joemcelroy (Member) commented Aug 5, 2024

This needs a tweak: the question const here is always the last message submitted, which isn't the question used for the search. We use the LLM to transform the question when there's more than one message.

You want to modify the chain to capture the question. The runnable sequence runs each function in succession.

For example:

const answerChain = RunnableSequence.from([
  {
    context: RunnableSequence.from([(input) => input.question, retrievalChain]),
    question: (input) => input.question,
  },
  RunnableLambda.from((inputs) => {
    data.appendMessageAnnotation({
      type: 'search_query',
      question: inputs.question,
    });
    return inputs;
  }),
  RunnableLambda.from(clipContext(this.options?.rag?.inputTokensLimit, prompt, data)),
  RunnableLambda.from(registerContextTokenCounts(data)),
  prompt,
  this.options.model.withConfig({ metadata: { type: 'question_answer_qa' } }),
]);

(Member) commented:

You should get something similar to, but not exactly the same as, the last message.

[screenshot]

/>
{` `}
(Member) commented:

Can you keep this space for this?

[Screenshot 2024-08-05 at 14:34]

@saarikabhasi (Member, author) commented Aug 6, 2024

Replaced by adding a space at the end of the sentence in 4ab7ae9. With this I think we can avoid adding an extra space on the next line; instead it's included as part of the sentence?

(Member) commented:

The challenge here is that the translated strings may start to drop the space.

@saarikabhasi (Member, author) commented:

Reverted to original in 6a35192
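
For context, a minimal sketch of the two approaches discussed above; the wrapper component and the import path are assumptions for illustration, not the PR's actual code:

import React from 'react';
import { FormattedMessage } from '@kbn/i18n-react';

// Illustrative only. The approach kept after the revert renders the space as
// an explicit JSX expression outside the translated string, so a translation
// can never strip it.
export const SearchingLabel = ({ question }: { question: string }) => (
  <>
    <FormattedMessage
      id="xpack.searchPlayground.chat.message.assistant.searchingQuestion"
      defaultMessage='Searching "{question}"'
      values={{ question }}
    />
    {` `}
    {/* The alternative tried in 4ab7ae9, a trailing space inside
        defaultMessage ('Searching "{question}" '), is fragile: the
        translated string may come back without it. */}
  </>
);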

@@ -198,6 +198,14 @@ class ConversationalChainFn {
        context: RunnableSequence.from([(input) => input.question, retrievalChain]),
        question: (input) => input.question,
      },
+     RunnableLambda.from((inputs) => {
+       console.log("inputs.question", inputs.question)
(Member) commented:

console.log

@saarikabhasi (Member, author) commented:

Good catch!

@saarikabhasi (Member, author) commented:

Reverted in 07a7860.

@@ -184,7 +184,7 @@ export function useAIAssistChat({
      if (messagesRef.current.length === 0) return null;

      const chatRequest: ChatRequest = {
-       messages: messagesRef.current,
+       messages: messagesRef.current.slice(0, messagesRef.current.length - 1),
@joemcelroy (Member) commented Aug 6, 2024

Could you double check what happens with errors?

Example:

  • send a message
  • get AI response
  • update Gen AI API key to be invalid
  • send a message
  • click re-generate

I think you need to remove recent system/AI messages up to the last human message, i.e. discard all system or assistant messages after the last human message.
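
A minimal sketch of that trimming logic; the helper name and the Message shape are assumptions for illustration, not the PR's actual code:

interface Message {
  role: 'user' | 'assistant' | 'system';
  content: string;
}

// Hypothetical helper: walk back from the end, dropping every trailing
// system/assistant message so the request ends with the last human message.
function trimToLastHumanMessage(messages: Message[]): Message[] {
  let end = messages.length;
  while (end > 0 && messages[end - 1].role !== 'user') {
    end -= 1;
  }
  // If there is no human message at all, there is nothing to regenerate from.
  return messages.slice(0, end);
}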

id="xpack.searchPlayground.chat.message.assistant.retrievalDocs"
defaultMessage="Grounding answer based on"
id="xpack.searchPlayground.chat.message.assistant.searchingQuestion"
defaultMessage='Searching "{question}"'
(Member) commented:

@leemthompo what do you think about this message? Should this be something like "Searching for {question} in data sources" instead?

(Member) commented:

This is what it looks like currently:

[Screenshot 2024-08-05 at 15:05]

(Contributor) commented:

@joemcelroy I could see this just saying "Searching..." without printing the rewritten query at all; that might be cleaner.

(Contributor) commented:

Otherwise, yeah, "Searching for" makes sense. I don't know if you need "in data sources", as the sentence will get pretty long for longer queries.

(Member) commented:

I think the point is that we want to provide visibility into the query that was actually used for the search. The LLM has transformed the conversation into a query, so it's useful to know.

@leemthompo (Contributor) commented Aug 7, 2024

Maybe we could be more explicit in that case and have two separate bullets:

  • "Transformed query: {query}" (could italicize the query to stand out)
  • "Searching your data sources..."
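
A hedged sketch of what that two-bullet version could look like; the second message id, the wrapper component, and the import path are illustrative assumptions, only the id naming convention comes from the PR:

import React from 'react';
import { FormattedMessage } from '@kbn/i18n-react';

// Illustrative only: two separate annotations instead of one sentence.
export const SearchSteps = ({ question }: { question: string }) => (
  <ul>
    <li>
      <FormattedMessage
        id="xpack.searchPlayground.chat.message.assistant.transformedQuery"
        defaultMessage="Transformed query: {query}"
        values={{ query: <em>{question}</em> }}
      />
    </li>
    <li>
      <FormattedMessage
        id="xpack.searchPlayground.chat.message.assistant.searchingDataSources"
        defaultMessage="Searching your data sources..."
      />
    </li>
  </ul>
);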

@saarikabhasi (Member, author) commented:

Added in 6a35192.

(Contributor) commented:

@joemcelroy it might be nice to have two bullets like I mentioned, to first establish the transformed query.

(Member) commented:

Let me pick this up with Julian. I'll approve for now so as not to block Saarika.

@saarikabhasi (Member, author) commented:

@elasticmachine merge upstream

@kibana-ci (Collaborator) commented:

💛 Build succeeded, but was flaky

Failed CI Steps

Metrics [docs]

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app:

id               | before  | after   | diff
-----------------|---------|---------|--------
searchPlayground | 141.2KB | 141.9KB | +690.0B

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@saarikabhasi merged commit 5f49cc5 into elastic:main on Aug 7, 2024
30 checks passed
@saarikabhasi deleted the playground-improve-followip-qn branch August 7, 2024 20:04