Idefics: enable generation tests #34062
Conversation
LGTM, thanks for debugging these tests!
@@ -4261,7 +4262,8 @@ def _assisted_decoding(
                         added_len,
                         is_decoder_attention=True,
                     )
-                else:
+                # some (V)LLMs have hard requirement on SDPA and thus never return attn
+                elif outputs.attentions[0] is not None:
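To make the guard concrete, here is a minimal, self-contained sketch (the `DummyStepOutputs` container and `crop_attentions` helper are hypothetical, not transformers code) of the pattern the new `elif` enforces: attention post-processing only runs when the backend actually returned attention weights, since SDPA-only models hand back `None` instead.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

import torch


@dataclass
class DummyStepOutputs:
    # Stand-in for a model's per-step output during assisted decoding (illustrative only).
    logits: torch.Tensor
    attentions: Optional[Tuple[Optional[torch.Tensor], ...]] = None


def crop_attentions(outputs: DummyStepOutputs, keep_len: int) -> DummyStepOutputs:
    """Crop per-layer attention maps to the last `keep_len` query positions, if they were returned."""
    if outputs.attentions is None or outputs.attentions[0] is None:
        # SDPA-only models never materialize attention weights, so there is nothing to crop.
        return outputs
    outputs.attentions = tuple(layer[..., -keep_len:, :] for layer in outputs.attentions)
    return outputs


# Eager-attention model: attentions are real tensors and get cropped.
eager = DummyStepOutputs(torch.zeros(1, 5, 32), attentions=(torch.zeros(1, 8, 5, 5),))
# SDPA-only model: attentions come back as (None,) and the guard short-circuits.
sdpa_only = DummyStepOutputs(torch.zeros(1, 5, 32), attentions=(None,))

crop_attentions(eager, keep_len=2)      # crops normally
crop_attentions(sdpa_only, keep_len=2)  # hits the guard and returns unchanged
```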
ouuu very nice!
LGTM, except for those two `continue` replacements.
(tests 💛)
tests/generation/test_utils.py
@@ -1567,13 +1572,13 @@ def test_generate_from_inputs_embeds_decoder_only(self):
             # This test is for decoder-only models (encoder-decoder models have native input embeddings support in the
             # decoder)
             if config.is_encoder_decoder:
-                continue
+                self.skipTest("This test is for decoder-only models")
We need a `continue` here :P
With a skip, we skip all the other models in `self.all_generative_model_classes`, and we might have encoder-decoder and decoder-only models mixed together in that iterator.
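A small runnable sketch (with made-up classes) of the difference: `self.skipTest(...)` raises `unittest.SkipTest` and aborts the whole test method, so every model class after the first encoder-decoder one would silently go untested, whereas `continue` only skips the current iteration.

```python
import unittest


class FakeEncoderDecoderConfig:
    is_encoder_decoder = True


class FakeDecoderOnlyConfig:
    is_encoder_decoder = False


class ExampleTest(unittest.TestCase):
    # Mixed iterator, as can happen in self.all_generative_model_classes.
    all_generative_model_classes = (FakeEncoderDecoderConfig, FakeDecoderOnlyConfig)

    def test_decoder_only_behaviour(self):
        checked = []
        for config_cls in self.all_generative_model_classes:
            if config_cls.is_encoder_decoder:
                # With self.skipTest(...) here, FakeDecoderOnlyConfig would never be reached.
                continue
            checked.append(config_cls.__name__)
        self.assertEqual(checked, ["FakeDecoderOnlyConfig"])


if __name__ == "__main__":
    unittest.main()
```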
hmm okay, I didn't think we could have cases with mixed model types. I was just confused that the test was passing for one model but not the other, and it took me a while to figure out hehe
tests/generation/test_utils.py
             # Skip models without explicit support
             config.is_decoder = True
             model = model_class(config).to(torch_device).eval()
             if "inputs_embeds" not in inspect.signature(model.prepare_inputs_for_generation).parameters.keys():
-                continue
+                self.skipTest("This model doesn't accept `inputs_embeds` in generate")
Likewise, this also needs to be a `continue`.
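For reference, a minimal sketch (with made-up model classes) of the capability check used in the hunk above: `inspect.signature` exposes the parameters of `prepare_inputs_for_generation`, so the test loop can detect whether a model accepts `inputs_embeds` before exercising it and move on to the next class otherwise.

```python
import inspect


class ModelWithEmbedsSupport:
    # Hypothetical model exposing inputs_embeds in its generation preparation hook.
    def prepare_inputs_for_generation(self, input_ids, inputs_embeds=None, **kwargs):
        return {"input_ids": input_ids, "inputs_embeds": inputs_embeds}


class ModelWithoutEmbedsSupport:
    # Hypothetical model that only accepts input_ids.
    def prepare_inputs_for_generation(self, input_ids, **kwargs):
        return {"input_ids": input_ids}


for model_cls in (ModelWithEmbedsSupport, ModelWithoutEmbedsSupport):
    params = inspect.signature(model_cls.prepare_inputs_for_generation).parameters
    supports_embeds = "inputs_embeds" in params
    print(model_cls.__name__, "supports inputs_embeds:", supports_embeds)
```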
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
* add idefics
* conflicts after merging main
* enable tests but need to fix some
* fix tests
* no print
* fix/skip some slow tests
* continue not skip
* rebasing broken smth, this is the fix
What does this PR do?
Enables `GenerationTesterMixin` for Idefics models. Tests are passing for me locally.
There are a few Idefics PRs open already, and they need to be reviewed/merged in the following order: #33907 -> #34043 -> current PR.
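For context, a model usually opts into these tests by adding `GenerationTesterMixin` to its test class and listing the generative classes it should iterate over. The sketch below is illustrative only: the import paths assume a transformers repo checkout run from the repo root, and the class lists are not the exact contents of this PR.

```python
# Rough sketch of how a model's test class picks up the shared generation tests.
# Import paths assume running from the root of a transformers checkout; the class
# lists here are illustrative, not the exact change made in this PR.
import unittest

from transformers import IdeficsForVisionText2Text, IdeficsModel
from transformers.testing_utils import require_torch

from tests.generation.test_utils import GenerationTesterMixin
from tests.test_modeling_common import ModelTesterMixin


@require_torch
class IdeficsModelTestSketch(ModelTesterMixin, GenerationTesterMixin, unittest.TestCase):
    # all_model_classes drives the common model tests; all_generative_model_classes
    # is what GenerationTesterMixin iterates over in tests like
    # test_generate_from_inputs_embeds_decoder_only shown above.
    all_model_classes = (IdeficsModel, IdeficsForVisionText2Text)
    all_generative_model_classes = (IdeficsForVisionText2Text,)
```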