[Search] [Playground] Improve follow up question flow #189848

Merged
@@ -38,7 +38,7 @@ describe('AssistantMessage component', () => {
   createdAt: new Date(),
   citations: [],
   retrievalDocs: [{ content: '', metadata: { _id: '1', _index: 'index', _score: 1 } }],
-  inputTokens: { context: 20, total: 10 },
+  inputTokens: { context: 20, total: 10, searchQuery: 'Test question' },
 };

 it('renders message content correctly', () => {
@@ -53,53 +53,79 @@ export const AssistantMessage: React.FC<AssistantMessageProps> = ({ message }) =
   return (
     <>
       {!!retrievalDocs?.length && (
-        <EuiComment
-          username={username}
-          timelineAvatar="dot"
-          data-test-subj="retrieval-docs-comment"
-          eventColor="subdued"
-          css={{
-            '.euiAvatar': { backgroundColor: euiTheme.colors.ghost },
-            '.euiCommentEvent': {
-              border: euiTheme.border.thin,
-              borderRadius: euiTheme.border.radius.medium,
-            },
-          }}
-          event={
-            <>
+        <>
+          <EuiComment
+            username={username}
+            timelineAvatar="dot"
+            data-test-subj="assistant-message-searching"
+            eventColor="subdued"
+            css={{
+              '.euiAvatar': { backgroundColor: euiTheme.colors.ghost },
+              '.euiCommentEvent': {
+                border: euiTheme.border.thin,
+                borderRadius: euiTheme.border.radius.medium,
+              },
+            }}
+            event={
               <EuiText size="s">
                 <p>
                   <FormattedMessage
-                    id="xpack.searchPlayground.chat.message.assistant.retrievalDocs"
-                    defaultMessage="Grounding answer based on"
+                    id="xpack.searchPlayground.chat.message.assistant.searchingQuestion"
+                    defaultMessage='Searching for "{question}"'
+                    values={{ question: inputTokens.searchQuery }}
                   />
-                  {` `}
                 </p>
               </EuiText>
+            }
+          />
+          <EuiComment
+            username={username}
+            timelineAvatar="dot"
+            data-test-subj="retrieval-docs-comment"
+            eventColor="subdued"
+            css={{
+              '.euiAvatar': { backgroundColor: euiTheme.colors.ghost },
+              '.euiCommentEvent': {
+                border: euiTheme.border.thin,
+                borderRadius: euiTheme.border.radius.medium,
+              },
+            }}
+            event={
+              <>
+                <EuiText size="s">
+                  <p>
+                    <FormattedMessage
+                      id="xpack.searchPlayground.chat.message.assistant.retrievalDocs"
+                      defaultMessage="Grounding answer based on"
+                    />
+                    {` `}
+                  </p>
+                </EuiText>

-              <EuiButtonEmpty
-                css={{ blockSize: 'auto' }}
-                size="s"
-                flush="left"
-                data-test-subj="retrieval-docs-button"
-                onClick={() => setIsDocsFlyoutOpen(true)}
-              >
-                <FormattedMessage
-                  id="xpack.searchPlayground.chat.message.assistant.retrievalDocButton"
-                  defaultMessage="{count} document sources"
-                  values={{ count: retrievalDocs.length }}
-                />
-              </EuiButtonEmpty>
+                <EuiButtonEmpty
+                  css={{ blockSize: 'auto' }}
+                  size="s"
+                  flush="left"
+                  data-test-subj="retrieval-docs-button"
+                  onClick={() => setIsDocsFlyoutOpen(true)}
+                >
+                  <FormattedMessage
+                    id="xpack.searchPlayground.chat.message.assistant.retrievalDocButton"
+                    defaultMessage="{count} document sources"
+                    values={{ count: retrievalDocs.length }}
+                  />
+                </EuiButtonEmpty>

-              {isDocsFlyoutOpen && (
-                <RetrievalDocsFlyout
-                  onClose={() => setIsDocsFlyoutOpen(false)}
-                  retrievalDocs={retrievalDocs}
-                />
-              )}
-            </>
-          }
-        />
+                {isDocsFlyoutOpen && (
+                  <RetrievalDocsFlyout
+                    onClose={() => setIsDocsFlyoutOpen(false)}
+                    retrievalDocs={retrievalDocs}
+                  />
+                )}
+              </>
+            }
+          />
+        </>
       )}
       {retrievalDocs?.length === 0 && (
         <EuiComment

Review thread (on the removed {` `} spacer):

Member: can you keep this space for this
[Screenshot 2024-08-05 at 14 34 33]

saarikabhasi (Member, Author), Aug 6, 2024: Replaced with adding a space at the end of the sentence in 4ab7ae9. With this I think we can avoid adding an extra space on the next line; it is included as part of the sentence instead.

Member: The challenge here is if the translation strings start to remove the space.

saarikabhasi (Member, Author): Reverted to original in 6a35192
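The change above is JSX, but the ordering it introduces can be sketched without React: when retrieval docs exist, a "Searching for …" timeline entry now precedes the document-sources entry. A minimal, framework-free sketch; `AIMessageView` and `timelineEntries` are illustrative names, not part of the plugin:

```typescript
// Hypothetical, framework-free sketch of the new timeline ordering.
interface AIMessageView {
  retrievalDocs: unknown[];
  searchQuery?: string;
}

function timelineEntries(message: AIMessageView): string[] {
  const entries: string[] = [];
  if (message.retrievalDocs.length > 0) {
    // New in this PR: surface the query that was actually sent to the retriever.
    if (message.searchQuery) {
      entries.push(`Searching for "${message.searchQuery}"`);
    }
    entries.push(`Grounding answer based on ${message.retrievalDocs.length} document sources`);
  }
  return entries;
}

const entries = timelineEntries({ retrievalDocs: [{}, {}], searchQuery: 'Test question' });
```

Splitting the timeline into two `EuiComment`s (rather than one) is what lets the searching step render even before the retrieval docs arrive.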
@@ -184,7 +184,7 @@ export function useAIAssistChat({
     if (messagesRef.current.length === 0) return null;

     const chatRequest: ChatRequest = {
-      messages: messagesRef.current,
+      messages: messagesRef.current.slice(0, messagesRef.current.length - 1),
       options,
       data,
     };
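The slice change above drops the trailing message from the outgoing chat request. A standalone sketch of that behavior; the `Message` shape here is simplified, not the hook's real type:

```typescript
// Simplified message shape for illustration only.
interface Message {
  role: 'user' | 'assistant';
  content: string;
}

const current: Message[] = [
  { role: 'user', content: 'What is RAG?' },
  { role: 'assistant', content: 'Retrieval-augmented generation grounds answers in documents.' },
  { role: 'user', content: 'How does it handle follow-ups?' },
];

// Mirrors messagesRef.current.slice(0, messagesRef.current.length - 1):
// everything except the most recent message goes into the chat request.
const requestMessages = current.slice(0, current.length - 1);
```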
x-pack/plugins/search_playground/public/types.ts (3 additions, 1 deletion)

@@ -100,8 +100,9 @@ export interface AnnotationDoc {
 }

 export interface AnnotationTokens {
-  type: 'prompt_token_count' | 'context_token_count' | 'context_clipped';
+  type: 'prompt_token_count' | 'context_token_count' | 'context_clipped' | 'search_query';
   count: number;
+  question?: string;
 }

 export interface Doc {

@@ -117,6 +118,7 @@ export interface AIMessage extends Message {
     context: number;
     total: number;
     contextClipped?: number;
+    searchQuery: string;
   };
 }
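The widened union and optional `question` field can be exercised with the same find-plus-type-predicate pattern the plugin uses elsewhere. A small self-contained sketch; the type is copied in simplified form and the sample data is invented:

```typescript
interface AnnotationTokens {
  type: 'prompt_token_count' | 'context_token_count' | 'context_clipped' | 'search_query';
  count: number;
  question?: string;
}

const annotations: AnnotationTokens[] = [
  { type: 'prompt_token_count', count: 42 },
  // count is unused for search_query annotations in this sketch.
  { type: 'search_query', count: 0, question: 'what is semantic search?' },
];

// find + type predicate narrows the element, then optional chaining reads question.
const searchQuery = annotations.find(
  (annotation): annotation is AnnotationTokens => annotation.type === 'search_query'
)?.question;
```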
@@ -44,6 +44,9 @@ export const transformFromChatMessages = (messages: UseChatHelpers['messages']):
         contextClipped: annotations?.find(
           (annotation): annotation is AnnotationTokens => annotation.type === 'context_clipped'
         )?.count,
+        searchQuery: annotations?.find(
+          (annotation): annotation is AnnotationTokens => annotation.type === 'search_query'
+        )?.question,
       },
     } as AIMessage;
   }
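Putting the lookups together, the transform builds the `inputTokens` object from the annotation list. A hedged standalone sketch; shapes are simplified, `toInputTokens` is an illustrative helper rather than the plugin's function, and the `?? 0` defaults are a sketch choice:

```typescript
interface AnnotationTokens {
  type: string;
  count: number;
  question?: string;
}

// Illustrative helper mirroring the mapping in transformFromChatMessages.
function toInputTokens(annotations: AnnotationTokens[]) {
  return {
    context: annotations.find((a) => a.type === 'context_token_count')?.count ?? 0,
    total: annotations.find((a) => a.type === 'prompt_token_count')?.count ?? 0,
    searchQuery: annotations.find((a) => a.type === 'search_query')?.question,
  };
}

const tokens = toInputTokens([
  { type: 'prompt_token_count', count: 10 },
  { type: 'context_token_count', count: 20 },
  { type: 'search_query', count: 0, question: 'Test question' },
]);
```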
@@ -198,6 +198,13 @@ class ConversationalChainFn {
         context: RunnableSequence.from([(input) => input.question, retrievalChain]),
         question: (input) => input.question,
       },
+      RunnableLambda.from((inputs) => {
+        data.appendMessageAnnotation({
+          type: 'search_query',
+          question: inputs.question,
+        });
+        return inputs;
+      }),
       RunnableLambda.from(clipContext(this.options?.rag?.inputTokensLimit, prompt, data)),
       RunnableLambda.from(registerContextTokenCounts(data)),
       prompt,
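The inserted RunnableLambda is a pass-through "tap": it records the (possibly LLM-rewritten) question as an annotation and returns its input unchanged, so downstream steps see the same input. A pared-down model of that pattern without LangChain; `sequence`, `tapSearchQuery`, and the annotation sink are all invented for illustration:

```typescript
// One chain step: takes a value, returns a value of the same shape.
type Step<T> = (input: T) => T;

// Runs steps left to right, feeding each output into the next step.
function sequence<T>(...steps: Array<Step<T>>): Step<T> {
  return (input) => steps.reduce((acc, step) => step(acc), input);
}

interface ChainInput {
  question: string;
}

// Stand-in for data.appendMessageAnnotation's destination.
const annotations: Array<{ type: string; question: string }> = [];

// Tap step: record the question, then pass the input through unchanged.
const tapSearchQuery: Step<ChainInput> = (inputs) => {
  annotations.push({ type: 'search_query', question: inputs.question });
  return inputs;
};

// Stand-in for the downstream steps (clipContext, prompt, model, ...).
const downstream: Step<ChainInput> = (inputs) => ({
  question: inputs.question.toUpperCase(),
});

const chain = sequence(tapSearchQuery, downstream);
const result = chain({ question: 'rewritten follow-up' });
```

Because the tap returns its input untouched, it can be inserted anywhere in the sequence without changing what the later steps compute.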
@@ -236,6 +243,10 @@
           type: 'prompt_token_count',
           count: getTokenEstimateFromMessages(msg),
         });
+        data.appendMessageAnnotation({
+          type: 'search_query',
+          question,
+        });
       }
     },
     // callback for prompt based models (Bedrock uses ActionsClientLlm)

Review thread (on the appended 'search_query' annotation):

joemcelroy (Member), Aug 5, 2024: This needs a tweak. The question const here is always the last message submission, which isn't the question used for the search. We use the LLM model to transform the question when there's more than one message.

You want to modify the chain to capture the question. The runnable sequence runs each function in succession. For example:

    const answerChain = RunnableSequence.from([
      {
        context: RunnableSequence.from([(input) => input.question, retrievalChain]),
        question: (input) => input.question,
      },
      RunnableLambda.from((inputs) => {
        data.appendMessageAnnotation({
          type: 'search_query',
          question: inputs.question,
        });
        return inputs;
      }),
      RunnableLambda.from(clipContext(this.options?.rag?.inputTokensLimit, prompt, data)),
      RunnableLambda.from(registerContextTokenCounts(data)),
      prompt,
      this.options.model.withConfig({ metadata: { type: 'question_answer_qa' } }),
    ]);

joemcelroy (Member): You should get something similar to, and not exactly the same as, the last message.
[image]