Add map result support in neural search for non text embedding models #258
Conversation
@zane-neo let's not skip the changelog. Also, the motivation for this change is not clear. Can you add a GitHub issue and explain why we need this change? Is there a new feature we are building that is going to use the new responses?
}
List<ModelTensor> tensorList = tensorOutputList.get(0).getMlModelTensors();
if (CollectionUtils.isEmpty(tensorList)) {
    log.error("No tensor found!");
Let's make this error message more understandable and actionable: state what went wrong and what resulted in this error.
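A more actionable message could name the model and suggest a next step. The sketch below is only an illustration of the reviewer's suggestion, not the wording merged in the PR; `validateTensorList` and the `modelId` parameter are hypothetical stand-ins.

```java
import java.util.List;

public class ErrorMessageSketch {
    // Hypothetical helper: returns an actionable error message when the
    // tensor list from the model response is missing, or null when it is fine.
    public static String validateTensorList(List<?> tensorList, String modelId) {
        if (tensorList == null || tensorList.isEmpty()) {
            return "Empty model tensor list in response from model [" + modelId
                + "]. Check that the model is deployed and returns tensors "
                + "compatible with neural search.";
        }
        return null;
    }

    public static void main(String[] args) {
        // An empty list produces the actionable message.
        System.out.println(validateTensorList(List.of(), "my-model"));
    }
}
```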
final ModelTensorOutput modelTensorOutput = (ModelTensorOutput) mlOutput;
final List<ModelTensors> tensorOutputList = modelTensorOutput.getMlModelOutputs();
if (CollectionUtils.isEmpty(tensorOutputList)) {
    log.error("No tensor output found!");
Let's make this error message more understandable and actionable: state what went wrong and what resulted in this error.
@@ -144,4 +174,19 @@ private List<List<Float>> buildVectorFromResponse(MLOutput mlOutput) {
    return vector;
}

private Map<String, ?> buildMapResultFromResponse(MLOutput mlOutput) {
    final ModelTensorOutput modelTensorOutput = (ModelTensorOutput) mlOutput;
Should we check the type of mlOutput before casting?
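The reviewer's suggestion amounts to guarding the cast with an `instanceof` check and failing with a clear message. The sketch below uses minimal stand-in types for `MLOutput` and `ModelTensorOutput` (the real ml-commons classes are not reproduced here), so it only illustrates the pattern.

```java
public class CastGuardSketch {
    // Minimal stand-ins for the ml-commons types referenced in the diff.
    interface MLOutput {}
    public static class ModelTensorOutput implements MLOutput {}

    // Guard the cast: reject unexpected MLOutput implementations early
    // with a descriptive message instead of throwing a raw ClassCastException.
    public static ModelTensorOutput asTensorOutput(MLOutput mlOutput) {
        if (!(mlOutput instanceof ModelTensorOutput)) {
            throw new IllegalArgumentException(
                "Expected ModelTensorOutput but got "
                    + (mlOutput == null ? "null" : mlOutput.getClass().getSimpleName()));
        }
        return (ModelTensorOutput) mlOutput;
    }

    public static void main(String[] args) {
        MLOutput ok = new ModelTensorOutput();
        System.out.println(asTensorOutput(ok) != null); // true
    }
}
```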
Sure, I created this issue: #260. Yes, the SPLADE feature needs to use the new response.
@zane-neo As this is a new feature, it will probably go through multiple iterations before it is released. Hence, for all new features, let's not merge the changes directly into main. Instead, let's keep reviewing the changes and merge them into a feature branch. Let me create a feature branch for this new feature. So the process will go like this:
Please let me know if there are any further questions. cc: @vamshin
@zane-neo Feature branch for sparse vector support: https://github.com/opensearch-project/neural-search/tree/feature/sparseVectorSupport Please use this branch for this and all future PRs related to sparse vector support.
Signed-off-by: zane-neo <[email protected]>
zane-neo force-pushed from 5892f92 to 8dddbeb
Closing this PR since another PR was raised against the feature branch: #270
Description
Neural search only supports text embedding model results, which are usually in a List<List<Float>> structure. For other models that return non-vector output, such as OpenAI or SPLADE, there are no functions available to extract the JSON object result. This PR adds several new functions to support these cases; the response is in a Map<String, ?> structure, and the upper layer can extract more information from the map based on its purpose.
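The description above can be sketched with plain collections: a sparse model like SPLADE returns token-to-weight pairs rather than a dense vector, so a Map<String, ?> preserves information a List<List<Float>> cannot. The method name and shapes below are illustrative assumptions, not the exact API added by this PR.

```java
import java.util.Map;

public class MapResultSketch {
    // Hypothetical counterpart to buildVectorFromResponse: returns the raw
    // data map (e.g. SPLADE token -> weight) so callers can extract what
    // they need, instead of forcing the result into a dense vector.
    public static Map<String, ?> buildMapResult(Map<String, ?> dataAsMap) {
        if (dataAsMap == null || dataAsMap.isEmpty()) {
            throw new IllegalStateException("No data map found in model response");
        }
        return dataAsMap;
    }

    public static void main(String[] args) {
        // A SPLADE-style sparse encoding: tokens with weights.
        Map<String, ?> result = buildMapResult(Map.of("hello", 1.2f, "world", 0.4f));
        System.out.println(result.size()); // 2
    }
}
```

The upper layer can then interpret the map per model type, e.g. treating float values as sparse-vector weights for SPLADE.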
Issues Resolved
#260
Check List
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.