[BUG] Require only one of 'llmQuestion' and 'llmMessages' for the RAG processor. #3067
austintlee added the bug (Something isn't working), untriaged, and v2.18.0 (Issues targeting release v2.18.0) labels on Oct 7, 2024.
austintlee added a commit to austintlee/ml-commons that referenced this issue on Oct 8, 2024:
…nsearch-project#3067) Signed-off-by: Austin Lee <[email protected]>
b4sjoo pushed a commit that referenced this issue on Oct 9, 2024:
…#3072) * Allow llmQuestion to be optional when llmMessages is used. (Issue #3067) Signed-off-by: Austin Lee <[email protected]> * Remove unused lines. Signed-off-by: Austin Lee <[email protected]> --------- Signed-off-by: Austin Lee <[email protected]>
opensearch-trigger-bot pushed a commit that referenced this issue on Oct 9, 2024:
…#3072) * Allow llmQuestion to be optional when llmMessages is used. (Issue #3067) Signed-off-by: Austin Lee <[email protected]> * Remove unused lines. Signed-off-by: Austin Lee <[email protected]> --------- Signed-off-by: Austin Lee <[email protected]> (cherry picked from commit 48d275d)
dhrubo-os pushed a commit that referenced this issue on Oct 9, 2024:
…#3072) (#3082) * Allow llmQuestion to be optional when llmMessages is used. (Issue #3067) Signed-off-by: Austin Lee <[email protected]> * Remove unused lines. Signed-off-by: Austin Lee <[email protected]> --------- Signed-off-by: Austin Lee <[email protected]> (cherry picked from commit 48d275d) Co-authored-by: Austin Lee <[email protected]>
Can this issue be closed?
What is the bug?
We currently require that llmQuestion is always passed, but this is no longer necessary when using Bedrock's Converse API, since the input to the LLM can instead be passed in llmMessages. A request that provides llmMessages but omits llmQuestion is rejected today; only one of the two should be required, as in the sketch below.
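For illustration, here is a hypothetical RAG search request that supplies the LLM input through llmMessages and omits llmQuestion. The index name, pipeline name, model id, and the exact layout of the llm_messages field are assumptions made for this sketch (inferred from the llmQuestion/llmMessages names in this issue), not taken from the ml-commons documentation, so the real request syntax may differ.

```
# Hypothetical request; names and field shapes below are illustrative only.
GET /my_index/_search?search_pipeline=my_rag_pipeline
{
  "query": {
    "match": { "text": "population of the nyc metro area" }
  },
  "ext": {
    "generative_qa_parameters": {
      "llm_model": "anthropic.claude-3-sonnet-20240229-v1:0",
      "llm_messages": [
        { "role": "user", "content": [ { "text": "What is the population of the NYC metro area?" } ] },
      ],
      "context_size": 5,
      "timeout": 30
    }
  }
}
```

With the current validation, a request like this fails because llmQuestion is absent; with the change proposed here (and implemented in #3072), llmQuestion should be optional whenever llmMessages is provided.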
How can one reproduce the bug?
Steps to reproduce the behavior:
What is the expected behavior?
A clear and concise description of what you expected to happen.
What is your host/environment?
Do you have any screenshots?
If applicable, add screenshots to help explain your problem.
Do you have any additional context?
Add any other context about the problem.