
Fix paligemma detection inference #31587

Merged

Conversation

@molbap (Contributor) commented Jun 25, 2024

What does this PR do?

  • Extends PaliGemma's extended attention mask during generation, fixing an inference bug that was especially visible for detection (see the illustrative sketch after this description).
  • Adds an object detection integration test to guard against future regressions.

Fixes #31425

hf hub discussion: https://huggingface.co/google/paligemma-3b-mix-448/discussions/6
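For intuition, here is a minimal, self-contained sketch of the prefix-LM masking pattern involved. This is an illustration of the mechanism, not the actual diff in this PR; the function names are made up, and the real implementation works on 4D batched masks:

```python
# Illustrative sketch only, not the actual diff in this PR: PaliGemma uses a
# prefix-LM attention pattern in which image and prompt tokens attend to each
# other bidirectionally, while generated tokens remain causal. The symptom
# fixed here was that cached and uncached generation diverged (issue #31425).
import torch

def build_prefix_lm_mask(prefix_len: int, seq_len: int) -> torch.Tensor:
    """Boolean mask of shape (seq_len, seq_len); True means "may attend"."""
    mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))  # causal base
    mask[:prefix_len, :prefix_len] = True  # full attention within the image+prompt prefix
    return mask

def extend_mask_by_one_token(mask: torch.Tensor) -> torch.Tensor:
    """Grow the mask by one row/column when a token is decoded with the KV cache."""
    n = mask.shape[0] + 1
    extended = torch.zeros(n, n, dtype=torch.bool)
    extended[:-1, :-1] = mask
    extended[-1, :] = True  # the new token attends to the whole sequence so far
    return extended

mask = build_prefix_lm_mask(prefix_len=4, seq_len=6)
mask = extend_mask_by_one_token(mask)  # must match what uncached decoding would see
```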

Who can review?

@pcuenca @ArthurZucker

@molbap (Contributor, Author) commented Jun 25, 2024

cc @zucchini-nlp, a generation bugfix you might want to look at

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@pcuenca (Member) left a comment

Verified it works: I can now replicate the same results with and without use_cache. Thank you! 🔥
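For reference, a minimal sketch of this kind of check. It assumes the google/paligemma-3b-mix-448 checkpoint from the linked hub discussion; the image URL, prompt, and generation settings are placeholders, not taken from this thread:

```python
# Sketch of a use_cache consistency check; placeholders noted in comments.
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

model_id = "google/paligemma-3b-mix-448"  # checkpoint from the linked hub discussion
processor = AutoProcessor.from_pretrained(model_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

url = "http://images.cocodataset.org/val2017/000000039769.jpg"  # placeholder image
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text="detect cat", images=image, return_tensors="pt").to(model.device)

with torch.no_grad():
    cached = model.generate(**inputs, max_new_tokens=30, do_sample=False, use_cache=True)
    uncached = model.generate(**inputs, max_new_tokens=30, do_sample=False, use_cache=False)

# Before this fix the two outputs could diverge; after it they must match.
assert torch.equal(cached, uncached)
print(processor.decode(cached[0], skip_special_tokens=True))
```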

@ArthurZucker (Collaborator) left a comment

thanks 🤗

@ArthurZucker ArthurZucker merged commit 492ee17 into huggingface:main Jun 26, 2024
22 checks passed
@edmondja commented Jul 9, 2024

Is this fix already integrated into the latest release (https://github.com/huggingface/transformers/releases)? I am not sure release notes are supposed to be exhaustive @ArthurZucker (PS: thanks for representing the MVA)

@ArthurZucker (Collaborator) commented

Hey!

[screenshot of the patch release notes]

This has been in the last 2 patch releases! The release notes for 4.42.0 should have it!

@ArthurZucker (Collaborator) commented

😉 nice to run into an alumnus!

@edmondja commented

> Hey! [screenshot of the patch release notes] This has been in the last 2 patch releases! The release notes for 4.42.0 should have it!

I really don't see it: there is no mention of "paligemma", "slow", or "extended" on the page I sent for "v4.42.0: Gemma 2, RTDETR, InstructBLIP, LLAVa Next, New Model Adder", but maybe I have "crap in my eyes". (Yes, it is nice, especially when you see the success of some, like you.)

So to conclude, maybe I just don't see it, and if that's the case please accept my apologies for wasting your time. Either way, thank you for confirming that the fix was added. All the best.

@ArthurZucker (Collaborator) commented

It is at the bottom:

[screenshot of the v4.42.0 release notes]

And absolutely no worries, we know this was quite important for everyone! 🤗

Development

Successfully merging this pull request may close these issues:

  • use_cache=False makes a huge difference in Paligemma (#31425)