Support for RAG with Q&A Model via Connector. Currently we can do RAG only with text generation models.
I am using a Q&A model for RAG.
The Q&A model takes two inputs, question and context:
{
"question": "where live??",
"context": "My name is Clara and I live in Madrid."
}
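
For reference, calling such a model directly through the ML Commons predict API looks roughly like this (a minimal sketch; the model ID is a placeholder):

POST /_plugins/_ml/models/<qa_model_id>/_predict
{
  "parameters": {
    "question": "Where do I live?",
    "context": "My name is Clara and I live in Madrid."
  }
}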
While doing neural search, how do I pass the parameters for question and context?
Where should I add the context? Or is utilizing two parameters not possible in OpenSearch RAG?
question and answer are not part of my document/index records. Only neural_field is part of my document, and it is vectorized into neural_field_vector.
When I do a neural search with a question, it returns the contents of neural_field, which works perfectly.
Now this same question should go to my LLM as question, and the neural_field value returned by the neural search should map to context.
In the LLM RAG pipeline I have no way to express that mapping; all I have is context_field_list, which refers to a document field and not to an LLM parameter.
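
To illustrate, the existing retrieval_augmented_generation response processor only accepts document field names (a sketch; the model ID and prompts are placeholders):

PUT /_search/pipeline/rag_pipeline
{
  "response_processors": [
    {
      "retrieval_augmented_generation": {
        "model_id": "<text_generation_model_id>",
        "context_field_list": ["neural_field"],
        "system_prompt": "You are a helpful assistant",
        "user_instructions": "Answer the question using only the given context"
      }
    }
  ]
}

There is nothing here that lets me say "send the retrieved text as the model's context parameter and the query as its question parameter".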
We need to support this; models can have multiple parameters, and more will appear as models advance.
@mingshl could we solve this with the generic ML inference search processors?
For the ml_inference search processors, yes: they can use the QA model to achieve QA search. The generative_qa_parameters can be put in the model_config parameter of the inference processor.
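
A rough sketch of such a response processor (the field names, output path, and model_config placement are assumptions, not a tested config):

PUT /_search/pipeline/qa_rag_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<qa_model_id>",
        "input_map": [
          {
            "context": "neural_field"
          }
        ],
        "output_map": [
          {
            "qa_answer": "answer"
          }
        ],
        "model_config": {
          "question": "Where do I live?"
        }
      }
    }
  ]
}

Here input_map feeds each hit's neural_field into the model's context input, output_map writes the model's answer back onto the hit as qa_answer, and model_config carries the question (hardcoded here only for illustration).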
Users can pipe up a search request processor, either the ml_inference search request processor or the text_embedding request processor; both should work.
Then the user should be able to issue standard neural queries, and the context will be the neural_field text retrieved by the search, which the response processor maps to the model's context input.
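
For example, a standard neural query run through the pipeline above (a sketch reusing the field names from the issue):

GET /my-index/_search?search_pipeline=qa_rag_pipeline
{
  "query": {
    "neural": {
      "neural_field_vector": {
        "query_text": "Where do I live?",
        "model_id": "<embedding_model_id>",
        "k": 5
      }
    }
  }
}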