
Idefics: enable generation tests #34062

Merged: 10 commits into huggingface:main, Oct 15, 2024
Conversation

@zucchini-nlp (Member) commented Oct 10, 2024:

What does this PR do?

Enables GenerationTesterMixin for Idefics models. Tests are passing for me locally.

There are already a few open Idefics PRs, and they need to be reviewed/merged in the following order: #33907 -> #34043 -> this PR
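For context, here is a minimal sketch of the opt-in pattern this PR enables, following the library's usual test-suite conventions. The exact class lists in the real Idefics test file may differ, and `GenerationTesterMixin` lives in the repo's test suite (tests/generation/test_utils.py), so this snippet only runs from a transformers source checkout:

```python
# Sketch of how a model test class opts into the shared generation tests;
# the Idefics class lists here are assumptions, not the actual test file.
import unittest

from transformers import is_torch_available
from transformers.testing_utils import require_torch

from tests.generation.test_utils import GenerationTesterMixin  # repo-local import

if is_torch_available():
    from transformers import IdeficsForVisionText2Text, IdeficsModel


@require_torch
class IdeficsModelTest(GenerationTesterMixin, unittest.TestCase):
    all_model_classes = (IdeficsModel, IdeficsForVisionText2Text) if is_torch_available() else ()
    # The mixin loops over these classes inside each test method, which is
    # why the per-class `continue` vs. skipTest distinction matters below.
    all_generative_model_classes = (IdeficsForVisionText2Text,) if is_torch_available() else ()
```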

@zucchini-nlp requested review from @gante and @ArthurZucker, and removed the request for @gante (Oct 10, 2024 11:20)
@ArthurZucker (Collaborator) left a comment:

LGTM, thanks for debugging these tests!

```diff
@@ -4261,7 +4262,8 @@ def _assisted_decoding(
                         added_len,
                         is_decoder_attention=True,
                     )
-                else:
+                # some (V)LLMs have hard requirement on SDPA and thus never return attn
+                elif outputs.attentions[0] is not None:
```
Collaborator (inline comment on the diff):

ouuu very nice!

@gante (Member) left a comment:

LGTM, except for those two `continue` replacements.

(tests 💛 )

```diff
@@ -1567,13 +1572,13 @@ def test_generate_from_inputs_embeds_decoder_only(self):
             # This test is for decoder-only models (encoder-decoder models have native input embeddings support in the
             # decoder)
             if config.is_encoder_decoder:
-                continue
+                self.skipTest("This test is for decoder-only models")
```
@gante (Member, inline comment):

We need a `continue` here :P

With a skip, we also skip all the other models in self.all_generative_model_classes, and we might have encoder-decoder and decoder-only models mixed together in that iterator.

@zucchini-nlp (Member, Author):

Hmm, okay, I didn't think we could have mixed model types in one test class. I was just confused that the test was passing for one model but not the other, and it took me a while to figure out.
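To make the skip-vs-continue distinction concrete, here is a small self-contained unittest sketch (the "model classes" are hypothetical stand-ins, not real transformers models): `skipTest` raises `unittest.SkipTest` and aborts the whole test method, while `continue` only moves on to the next class in the loop.

```python
import unittest


class MixedModelGenerationTest(unittest.TestCase):
    # a single iterator mixing encoder-decoder and decoder-only entries,
    # as described in the comment above; (name, is_encoder_decoder) pairs
    all_generative_model_classes = (
        ("T5Like", True),
        ("GPTLike", False),
    )

    def test_decoder_only_behavior(self):
        tested = []
        for name, is_encoder_decoder in self.all_generative_model_classes:
            if is_encoder_decoder:
                # self.skipTest(...) here would raise unittest.SkipTest and
                # abort the whole method, so GPTLike would never be tested.
                continue
            tested.append(name)
        self.assertEqual(tested, ["GPTLike"])


if __name__ == "__main__":
    unittest.main()
```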


```diff
             # Skip models without explicit support
             config.is_decoder = True
             model = model_class(config).to(torch_device).eval()
             if "inputs_embeds" not in inspect.signature(model.prepare_inputs_for_generation).parameters.keys():
-                continue
+                self.skipTest("This model doesn't accept `inputs_embeds` in generate")
```
@gante (Member, inline comment):

Likewise, this also needs to be a `continue`.

@HuggingFaceDocBuilderDev commented:

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp merged commit 23874f5 into huggingface:main on Oct 15, 2024
24 of 25 checks passed
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
* add idefics

* conflicts after merging main

* enable tests but need to fix some

* fix tests

* no print

* fix/skip some slow tests

* continue not skip

* rebasing broken smth, this is the fix
Labels: None yet
Projects: None yet
Participants: 4