Hi! Loving the Arena for quick inspection of models :)
I noticed that the retrieval scores are computed as dot products rather than cosine similarity, even though the embeddings are not normalized. I manually added normalization in a local deployment and got significantly different results, at least for the jinaai/jina-embeddings-v2-base-en model. Do you think we could add an optional parameter to model_meta.yml to normalize embeddings during the model.encode call? I'm happy to make a PR.
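For concreteness, here is a minimal numpy sketch (toy vectors, not Arena code) showing how dot-product and cosine rankings can diverge when embeddings are unnormalized:

```python
import numpy as np

# Toy unnormalized embeddings: a query and two candidate documents.
query = np.array([1.0, 0.0])
docs = np.array([
    [10.0, 10.0],  # large norm, only partially aligned with the query
    [1.0, 0.1],    # small norm, almost perfectly aligned
])

# Raw dot-product scores reward magnitude: doc 0 wins.
dot_scores = docs @ query  # [10.0, 1.0]

# Cosine similarity is the dot product of L2-normalized vectors:
# here the ranking flips and doc 1 wins.
def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

cos_scores = l2_normalize(docs) @ (query / np.linalg.norm(query))
print(dot_scores, cos_scores)  # [10. 1.] vs. [~0.707 ~0.995]
```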
Yep, that is totally correct - the https://github.com/embeddings-benchmark/mteb/blob/main/mteb/models/ folder is the gold standard reference for evaluated models.
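For what it's worth, a sketch of what the opt-in could look like. The normalize key in model_meta.yml is hypothetical, but normalize_embeddings is a real parameter of SentenceTransformer.encode, so the flag could be passed straight through:

```python
from sentence_transformers import SentenceTransformer

# Hypothetical: model_meta.yml gains an optional `normalize: true` entry
# (key name is an assumption, not the current schema). The Arena could
# forward it to sentence-transformers, whose encode() already supports
# L2 normalization out of the box.
model = SentenceTransformer(
    "jinaai/jina-embeddings-v2-base-en",
    trust_remote_code=True,  # required by the Jina v2 models
)
emb = model.encode(
    ["what is dense retrieval?"],
    normalize_embeddings=True,  # unit-norm output: dot product == cosine
)
```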