[CI][PTQ] benchmark #2819

Merged

Conversation

AlexanderDokuchaev (Collaborator) commented Jul 16, 2024

Changes

  • Update conformance tests for benchmark jobs
  • Add pytest-forked to run each test in its own process for more accurate memory monitoring (note: does not work for torch_cuda)
  • Change the column order in results.csv (status should be the last column)
  • Add build_url to report.csv (for faster access to the log of a specific model)
  • Catch exceptions when running the benchmark to avoid an error status for models with dynamic shapes (to be fixed later; a sketch follows this list)
  • Fix an issue with gpt2 where the input data had an incorrect shape
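
A minimal sketch of the exception guard mentioned above, assuming a hypothetical `run_benchmark` helper and a simplified `RunInfo` record (illustrative names, not the actual conformance-suite code):

```python
import subprocess
from dataclasses import dataclass
from typing import Optional


@dataclass
class RunInfo:
    """Simplified per-model result record (illustrative)."""
    model: str
    fps: Optional[float] = None
    error_message: str = ""
    status: str = ""  # kept as the last column when dumped to results.csv


def run_benchmark(model_path: str, info: RunInfo) -> None:
    """Run benchmark_app without propagating a benchmark failure.

    Models with dynamic input shapes may fail inside benchmark_app, so the
    error is recorded instead of turning the whole run into an error status.
    """
    try:
        result = subprocess.run(
            ["benchmark_app", "-m", model_path],
            capture_output=True, text=True, check=True,
        )
        # Parsing FPS out of result.stdout would go here.
    except subprocess.CalledProcessError as e:
        info.error_message = e.stderr.strip()
```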

Related tickets

111533

Tests

manual/job/post_training_quantization_performance/
manual/job/post_training_weight_compression_performance/

@github-actions github-actions bot added the NNCF PTQ Pull requests that updates NNCF PTQ label Jul 16, 2024
@github-actions github-actions bot added the documentation Improvements or additions to documentation label Aug 14, 2024
@AlexanderDokuchaev AlexanderDokuchaev marked this pull request as ready for review August 14, 2024 12:43
@AlexanderDokuchaev AlexanderDokuchaev requested a review from a team as a code owner August 14, 2024 12:43
@AlexanderDokuchaev AlexanderDokuchaev merged commit b882992 into openvinotoolkit:develop Aug 19, 2024
13 checks passed
AlexanderDokuchaev added a commit to AlexanderDokuchaev/nncf that referenced this pull request Aug 23, 2024
daniil-lyakhov (Collaborator) commented:

I could not run this test locally:

python -m pytest tests/post_training/test_quantize_conformance.py::test_ptq_quantization -k torchvision/vit_b_16_backend_OV --no-eval --data /home/dlyakhov/datasets/imagenet_one_pic/ -s
================================================================================ test session starts ================================================================================
platform linux -- Python 3.8.10, pytest-8.0.2, pluggy-1.5.0
rootdir: /home/dlyakhov
plugins: mock-3.12.0, cov-4.1.0, dependency-0.6.0, xdist-3.5.0
collecting ... INFO:nncf:NNCF initialized successfully. Supported frameworks detected: torch, onnx, openvino
collected 130 items / 129 deselected / 1 selected                                                                                                                                   

tests/post_training/test_quantize_conformance.py .E

====================================================================================== ERRORS =======================================================================================
____________________________________________________ ERROR at teardown of test_ptq_quantization[torchvision/vit_b_16_backend_OV] ____________________________________________________

self = <_pytest.config.Config object at 0x7f035c681700>, name = 'forked', default = <NOTSET>, skip = False

    def getoption(self, name: str, default=notset, skip: bool = False):
        """Return command line option value.
    
        :param name: Name of the option.  You may also specify
            the literal ``--OPT`` option instead of the "dest" option name.
        :param default: Default value if no option of that name exists.
        :param skip: If True, raise pytest.skip if option does not exists
            or has a None value.
        """
        name = self._opt2dest.get(name, name)
        try:
>           val = getattr(self.option, name)
E           AttributeError: 'Namespace' object has no attribute 'forked'

../../env/nncf_pt/lib/python3.8/site-packages/_pytest/config/__init__.py:1646: AttributeError

The above exception was the direct cause of the following exception:

output_dir = PosixPath('tmp'), run_benchmark_app = False, pytestconfig = <_pytest.config.Config object at 0x7f035c681700>

    @pytest.fixture(scope="session", name="ptq_result_data")
    def fixture_ptq_report_data(output_dir, run_benchmark_app, pytestconfig):
        data: Dict[str, RunInfo] = {}
    
        yield data
    
        if data:
            test_results = OrderedDict(sorted(data.items()))
            df = pd.DataFrame(v.get_result_dict() for v in test_results.values())
            if not run_benchmark_app:
                df = df.drop(columns=["FPS"])
    
            output_dir.mkdir(parents=True, exist_ok=True)
            output_file = output_dir / "results.csv"
    
>           if pytestconfig.getoption("forked") and output_file.exists():

tests/post_training/test_quantize_conformance.py:156: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_pytest.config.Config object at 0x7f035c681700>, name = 'forked', default = <NOTSET>, skip = False

    def getoption(self, name: str, default=notset, skip: bool = False):
        """Return command line option value.
    
        :param name: Name of the option.  You may also specify
            the literal ``--OPT`` option instead of the "dest" option name.
        :param default: Default value if no option of that name exists.
        :param skip: If True, raise pytest.skip if option does not exists
            or has a None value.
        """
        name = self._opt2dest.get(name, name)
        try:
            val = getattr(self.option, name)
            if val is None and skip:
                raise AttributeError(name)
            return val
        except AttributeError as e:
            if default is not notset:
                return default
            if skip:
                import pytest
    
                pytest.skip(f"no {name!r} option found")
>           raise ValueError(f"no option named {name!r}") from e
E           ValueError: no option named 'forked'

../../env/nncf_pt/lib/python3.8/site-packages/_pytest/config/__init__.py:1657: ValueError
================================================================================= warnings summary ==================================================================================
Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
  /home/dlyakhov/env/nncf_pt/lib/python3.8/site-packages/torch/_tensor.py:986: TracerWarning: Converting a tensor to a Python number might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    return self.item().__format__(format_spec)

Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
  /home/dlyakhov/env/nncf_pt/lib/python3.8/site-packages/torch/__init__.py:1777: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
    assert condition, message

Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
  /home/dlyakhov/Projects/nncf/nncf/quantization/algorithms/post_training/pipeline.py:87: FutureWarning: `AdvancedQuantizationParameters(smooth_quant_alpha=..)` is deprecated.Please, use `AdvancedQuantizationParameters(smooth_quant_alphas)` option with AdvancedSmoothQuantParameters(convolution=.., matmul=..) as value instead.
    warning_deprecated(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================== short test summary info ==============================================================================
ERROR tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV] - ValueError: no option named 'forked'

When I try to use --forked, it does not work either:

python -m pytest tests/post_training/test_quantize_conformance.py::test_ptq_quantization -k torchvision/vit_b_16_backend_OV --no-eval --data /home/dlyakhov/datasets/imagenet_one_pic/ -s --forked
ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]
__main__.py: error: unrecognized arguments: --forked
  inifile: None
  rootdir: /home/dlyakhov

ljaljushkin (Contributor) commented Aug 30, 2024

@daniil-lyakhov
Most probably the pytest-forked dependency is missing in your env: https://github.com/openvinotoolkit/nncf/blob/develop/constraints.txt#L23
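
For reference, the plugin installs with `pip install pytest-forked`, and it is pinned in constraints.txt. A defensive alternative, sketched below as a hypothetical conftest.py snippet (not the actual fix), is to pass a default to `getoption` so the check also works when the plugin is absent:

```python
# conftest.py (sketch): read an option registered by an optional plugin.
def pytest_configure(config):
    # Without a default, config.getoption("forked") raises
    # ValueError("no option named 'forked'") when pytest-forked is not
    # installed; passing a default turns the error into a soft fallback.
    forked = config.getoption("forked", default=False)
    if forked:
        print("tests will run in forked subprocesses")
```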
