[tests] run one test but got 2 test results #35159

Open

faaany opened this issue Dec 9, 2024 · 1 comment
faaany commented Dec 9, 2024

System Info

  • transformers version: 4.47.0.dev0
  • Platform: Linux-4.18.0-425.3.1.el8.x86_64-x86_64-with-glibc2.35
  • Python version: 3.10.12
  • Huggingface_hub version: 0.26.5
  • Safetensors version: 0.4.5
  • Accelerate version: 1.1.0.dev0
  • Accelerate config: - compute_environment: LOCAL_MACHINE
    - distributed_type: MULTI_GPU
    - mixed_precision: bf16
    - use_cpu: False
    - debug: False
    - num_processes: 2
    - machine_rank: 0
    - num_machines: 1
    - gpu_ids: all
    - rdzv_backend: static
    - same_network: True
    - main_training_function: main
    - enable_cpu_affinity: False
    - downcast_bf16: no
    - tpu_use_cluster: False
    - tpu_use_sudo: False
    - tpu_env: []
  • PyTorch version (GPU?): 2.5.1+cu121 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using distributed or parallel set-up in script?:
  • Using GPU in script?:
  • GPU type: NVIDIA A100 80GB PCIe

Who can help?

@ydshieh

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Running the following test command:

pytest -rA tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes

I get 2 test results:

======================================================== short test summary info =========================================================
PASSED tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes
[Testing <class 'transformers.models.mra.modeling_mra.MraForSequenceClassification'>] SUBFAIL tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes - ValueError: sequence length must be divisible by the block_size.
================================================ 1 failed, 1 passed, 3 warnings in 6.27s =================================================

Expected behavior

I expect this command to give me only one test result rather than two.

Below are the experiments I tried (a minimal sketch reproducing this behavior follows the list):

  • with pytest-subtests installed and the line with self.subTest(msg=f"Testing {model_class}"): commented out, the test passes with 1 test result
  • with pytest-subtests uninstalled and the same line commented out, the test passes with 1 test result
  • with pytest-subtests uninstalled and the line left in, the test fails with 1 test result
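
For reference, here is a minimal standalone sketch (hypothetical file name test_sub.py, not from the transformers test suite) that reproduces the same reporting pattern:

# test_sub.py -- hypothetical minimal reproduction
import unittest

class DemoTest(unittest.TestCase):
    def test_with_subtest(self):
        for value in [1, 2]:
            with self.subTest(msg=f"Testing {value}"):
                # the second iteration fails, mirroring the failing subtest above
                self.assertEqual(value, 1)

With pytest-subtests installed, pytest -rA test_sub.py reports a SUBFAIL for the failing subtest plus a separate result for the enclosing test method; with it uninstalled, plain pytest reports a single failed test.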

I also saw your other PR #34806. Could this be another issue with pytest-subtests?

faaany added the bug label Dec 9, 2024
ydshieh commented Dec 9, 2024

Hi @faaany !

pytest-subtests is designed to report multiple states (one per subtest) even for a single test method.
So far the transformers CI environment doesn't have pytest-subtests installed (yet).
pytest-subtests also has a known issue with its reporting, see this.

transformers might install pytest-subtests in the CI environment soon, and we will definitely need to work on some issues that haven't been observed so far. But reporting multiple states for a single test comes from pytest-subtests and is not something transformers can change.
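
For context, the failing test follows this general pattern (a simplified sketch, not the exact transformers code; all_model_classes stands in for the real attribute populated by the test suite):

# simplified sketch of a test that loops over model classes with subTest
import unittest

class MraModelTest(unittest.TestCase):
    all_model_classes = []  # populated with MRA model classes in the real suite

    def test_load_with_mismatched_shapes(self):
        for model_class in self.all_model_classes:
            with self.subTest(msg=f"Testing {model_class}"):
                # load a checkpoint with mismatched shapes here (placeholder);
                # a failure in one model class shows up as the extra SUBFAIL
                pass

When pytest-subtests is active, each failing subtest is reported separately from the enclosing test method, which is why one pytest invocation can yield both a PASSED and a SUBFAIL entry.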
