[Search][Playground] Improve follow up question flow #189848
Conversation
@@ -236,6 +236,10 @@ class ConversationalChainFn {
          type: 'prompt_token_count',
          count: getTokenEstimateFromMessages(msg),
        });
+       data.appendMessageAnnotation({
+         type: 'search_query',
+         question,
This needs a tweak: the `question` const here is always the last message submitted, which isn't the question used for the search. We use the LLM to transform the question when there's more than one message.
You want to modify the chain to capture the transformed question. The runnable sequence runs each function in succession.
For example:
const answerChain = RunnableSequence.from([
  {
    context: RunnableSequence.from([(input) => input.question, retrievalChain]),
    question: (input) => input.question,
  },
  RunnableLambda.from((inputs) => {
    data.appendMessageAnnotation({
      type: 'search_query',
      question: inputs.question,
    });
    return inputs;
  }),
  RunnableLambda.from(clipContext(this.options?.rag?.inputTokensLimit, prompt, data)),
  RunnableLambda.from(registerContextTokenCounts(data)),
  prompt,
  this.options.model.withConfig({ metadata: { type: 'question_answer_qa' } }),
]);
/>
{` `}
Replaced with a space at the end of the sentence in 4ab7ae9. With this I think we can avoid adding an extra space on the next line; it's included as part of the sentence instead?
The challenge here is that translators may strip the trailing space from the string.
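To illustrate the fragility (a hypothetical sketch, not the actual Kibana component — the strings and identifiers here are made up): a trailing space baked into a translatable string silently disappears if a translator or message-extraction tool trims it, whereas a space added explicitly in the markup survives translation.

```typescript
// Approach 1: trailing space inside the translatable string.
// A translator or extraction tool may trim it away.
const defaultMessage = 'Searching your data sources '; // note trailing space
const translated = defaultMessage.trim(); // plausible translator output

// Approach 2: keep the string trimmed and add the space explicitly
// in code (the {` `} in the JSX above), outside the translation.
const message = 'Searching your data sources';
const suffix = '(3 results)';
const safe = `${message} ${suffix}`; // space added in code, not in the string
const broken = `${translated}${suffix}`; // words run together once trimmed

console.assert(safe === 'Searching your data sources (3 results)');
console.assert(broken === 'Searching your data sources(3 results)');
```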
Reverted to original in 6a35192
@@ -198,6 +198,14 @@ class ConversationalChainFn {
          context: RunnableSequence.from([(input) => input.question, retrievalChain]),
          question: (input) => input.question,
        },
+       RunnableLambda.from((inputs) => {
+         console.log("inputs.question",inputs.question)
Leftover `console.log`.
Good catch!
Reverted in 07a7860
@@ -184,7 +184,7 @@ export function useAIAssistChat({
        if (messagesRef.current.length === 0) return null;

        const chatRequest: ChatRequest = {
-         messages: messagesRef.current,
+         messages: messagesRef.current.slice(0, messagesRef.current.length-1),
Could you double check what happens with errors?
Example:
- send a message
- get an AI response
- update the Gen AI API key to be invalid
- send a message
- click re-generate

I think you need to remove recent system/AI messages up to the last human message. This means discarding all system or assistant messages after the last human message.
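A minimal sketch of that trimming (the `Message` shape and helper name are assumptions for illustration, not the PR's actual code): walk back from the end of the history to the last human message and drop everything after it, so a re-generate after a failed request doesn't resubmit a partial or error response.

```typescript
// Assumed message shape, roughly following common chat-message conventions.
interface Message {
  role: 'user' | 'assistant' | 'system';
  content: string;
}

// Keep everything up to and including the last human (user) message,
// discarding any trailing assistant/system messages.
function trimToLastHumanMessage(messages: Message[]): Message[] {
  for (let i = messages.length - 1; i >= 0; i--) {
    if (messages[i].role === 'user') {
      return messages.slice(0, i + 1);
    }
  }
  // No human message at all: nothing to resubmit.
  return [];
}

const history: Message[] = [
  { role: 'user', content: 'hi' },
  { role: 'assistant', content: 'hello' },
  { role: 'user', content: 'follow up' },
  { role: 'assistant', content: '' }, // failed or partial AI response
];
console.log(trimToLastHumanMessage(history).length); // 3
```

Unlike the fixed `slice(0, length - 1)` in the diff, this handles the case where several assistant or system messages accumulate after the last human turn.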
  id="xpack.searchPlayground.chat.message.assistant.retrievalDocs"
  defaultMessage="Grounding answer based on"
  id="xpack.searchPlayground.chat.message.assistant.searchingQuestion"
  defaultMessage='Searching "{question}"'
@leemthompo what do you think about this message? Should this be something like "Searching for {question} in data sources" instead?
@joemcelroy I could see where this just says "Searching..." without printing the rewritten query at all, might be cleaner
Otherwise, yeah, "Searching for" makes sense. Not sure you need "in data sources" — the sentence will get pretty long for longer queries.
I think the reason is that we want to provide visibility into the query that's actually used for the search. The LLM here has transformed the conversation into a query, so it's useful to know.
Added in 6a35192
@joemcelroy might be nice to have two bullets like I mentioned to first establish the transformed query
Let me pick this up with Julian. I'll approve for now so as not to block Saarika.
@elasticmachine merge upstream
💛 Build succeeded, but was flaky
To update your PR or re-run it, just comment with:
Summary
Adds the query used for searching. This gives the user visibility into the query used for follow-up questions.
Checklist
Delete any items that are not applicable to this PR.