[CI][PTQ] benchmark #2819
Merged
AlexanderDokuchaev merged 13 commits into openvinotoolkit:develop from AlexanderDokuchaev:ad/ptq_bench on Aug 19, 2024
Conversation
github-actions bot added the documentation (Improvements or additions to documentation) label on Aug 14, 2024
nikita-savelyevv approved these changes on Aug 19, 2024
AlexanderDokuchaev added a commit to AlexanderDokuchaev/nncf that referenced this pull request on Aug 23, 2024
### Changes
- Update the conformance tests for the benchmark jobs.
- Add pytest-forked to run each test in its own process, for more accurate memory monitoring (note: does not work for torch_cuda).
- Change the order of columns in results.csv (status should be last).
- Add build_url to report.csv (for faster access to the log of a specific model).
- Catch exceptions when running the benchmark, to avoid an error status for models with dynamic shapes (to be fixed later; see the sketch below).
- Fix an issue with gpt2 where the input data had an incorrect shape.

### Related tickets
111533

### Tests
manual/job/post_training_quantization_performance/
manual/job/post_training_weight_compression_performance/
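A minimal sketch of the "catch exceptions when running the benchmark" idea from the list above, assuming a hypothetical `run_benchmark_safely` helper and a plain `benchmark_app` command line; this is not the actual NNCF conformance-test code. A benchmark failure (for example, on a model with a dynamic input shape) is recorded as a note instead of failing the test, so only the FPS value goes missing from the report.

```python
# Hypothetical helper, not part of NNCF: record a benchmark failure instead of
# propagating it, so the model keeps a non-error status in results.csv.
import subprocess


def run_benchmark_safely(model_path: str) -> str:
    """Run benchmark_app and return its output, or an error note on failure."""
    cmd = ["benchmark_app", "-m", model_path, "-t", "10"]  # assumed invocation
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return result.stdout
    except (subprocess.CalledProcessError, FileNotFoundError) as exc:
        # Models with dynamic shapes are expected to fail here for now
        # ("to be fixed later"), so the exception is swallowed and noted.
        return f"benchmark skipped: {exc}"
```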
I could not run this test locally:
python -m pytest tests/post_training/test_quantize_conformance.py::test_ptq_quantization -k torchvision/vit_b_16_backend_OV --no-eval --data /home/dlyakhov/datasets/imagenet_one_pic/ -s
================================================================================ test session starts ================================================================================
platform linux -- Python 3.8.10, pytest-8.0.2, pluggy-1.5.0
rootdir: /home/dlyakhov
plugins: mock-3.12.0, cov-4.1.0, dependency-0.6.0, xdist-3.5.0
collecting ... INFO:nncf:NNCF initialized successfully. Supported frameworks detected: torch, onnx, openvino
collected 130 items / 129 deselected / 1 selected
tests/post_training/test_quantize_conformance.py .E
====================================================================================== ERRORS =======================================================================================
____________________________________________________ ERROR at teardown of test_ptq_quantization[torchvision/vit_b_16_backend_OV] ____________________________________________________
self = <_pytest.config.Config object at 0x7f035c681700>, name = 'forked', default = <NOTSET>, skip = False
def getoption(self, name: str, default=notset, skip: bool = False):
"""Return command line option value.
:param name: Name of the option. You may also specify
the literal ``--OPT`` option instead of the "dest" option name.
:param default: Default value if no option of that name exists.
:param skip: If True, raise pytest.skip if option does not exists
or has a None value.
"""
name = self._opt2dest.get(name, name)
try:
> val = getattr(self.option, name)
E AttributeError: 'Namespace' object has no attribute 'forked'
../../env/nncf_pt/lib/python3.8/site-packages/_pytest/config/__init__.py:1646: AttributeError
The above exception was the direct cause of the following exception:
output_dir = PosixPath('tmp'), run_benchmark_app = False, pytestconfig = <_pytest.config.Config object at 0x7f035c681700>
@pytest.fixture(scope="session", name="ptq_result_data")
def fixture_ptq_report_data(output_dir, run_benchmark_app, pytestconfig):
data: Dict[str, RunInfo] = {}
yield data
if data:
test_results = OrderedDict(sorted(data.items()))
df = pd.DataFrame(v.get_result_dict() for v in test_results.values())
if not run_benchmark_app:
df = df.drop(columns=["FPS"])
output_dir.mkdir(parents=True, exist_ok=True)
output_file = output_dir / "results.csv"
> if pytestconfig.getoption("forked") and output_file.exists():
tests/post_training/test_quantize_conformance.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <_pytest.config.Config object at 0x7f035c681700>, name = 'forked', default = <NOTSET>, skip = False
def getoption(self, name: str, default=notset, skip: bool = False):
"""Return command line option value.
:param name: Name of the option. You may also specify
the literal ``--OPT`` option instead of the "dest" option name.
:param default: Default value if no option of that name exists.
:param skip: If True, raise pytest.skip if option does not exists
or has a None value.
"""
name = self._opt2dest.get(name, name)
try:
val = getattr(self.option, name)
if val is None and skip:
raise AttributeError(name)
return val
except AttributeError as e:
if default is not notset:
return default
if skip:
import pytest
pytest.skip(f"no {name!r} option found")
> raise ValueError(f"no option named {name!r}") from e
E ValueError: no option named 'forked'
../../env/nncf_pt/lib/python3.8/site-packages/_pytest/config/__init__.py:1657: ValueError
================================================================================= warnings summary ==================================================================================
Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
/home/dlyakhov/env/nncf_pt/lib/python3.8/site-packages/torch/_tensor.py:986: TracerWarning: Converting a tensor to a Python number might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
return self.item().__format__(format_spec)
Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
/home/dlyakhov/env/nncf_pt/lib/python3.8/site-packages/torch/__init__.py:1777: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
assert condition, message
Projects/nncf/tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV]
/home/dlyakhov/Projects/nncf/nncf/quantization/algorithms/post_training/pipeline.py:87: FutureWarning: `AdvancedQuantizationParameters(smooth_quant_alpha=..)` is deprecated.Please, use `AdvancedQuantizationParameters(smooth_quant_alphas)` option with AdvancedSmoothQuantParameters(convolution=.., matmul=..) as value instead.
warning_deprecated(
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================== short test summary info ==============================================================================
ERROR tests/post_training/test_quantize_conformance.py::test_ptq_quantization[torchvision/vit_b_16_backend_OV] - ValueError: no option named 'forked'

When I try to run python -m pytest tests/post_training/test_quantize_conformance.py::test_ptq_quantization -k torchvision/vit_b_16_backend_OV --no-eval --data /home/dlyakhov/datasets/imagenet_one_pic/ -s --forked, I get:

ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]
__main__.py: error: unrecognized arguments: --forked
  inifile: None
  rootdir: /home/dlyakhov
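Both failures suggest the local environment is missing the pytest-forked plugin, which registers the --forked option (it is not listed in the "plugins:" line of the session header above). A hedged sketch, under that assumption, of how a session-scoped report fixture can tolerate the missing option by passing a default to getoption(); the fixture name and CSV handling here are illustrative, not the NNCF code.

```python
# Hedged sketch (hypothetical fixture, not the NNCF conformance code): passing a
# default to pytestconfig.getoption() avoids the ValueError from the traceback
# above when the pytest-forked plugin (and its "forked" option) is absent.
from pathlib import Path

import pandas as pd
import pytest


@pytest.fixture(scope="session", name="ptq_result_data_sketch")
def fixture_ptq_report_data_sketch(tmp_path_factory, pytestconfig):
    data = {}
    yield data
    if not data:
        return
    output_file = Path(tmp_path_factory.mktemp("ptq")) / "results.csv"
    df = pd.DataFrame(list(data.values()))
    # getoption() raises ValueError for unknown options unless a default is given.
    if pytestconfig.getoption("forked", default=False) and output_file.exists():
        # Under --forked each test process appends its own row to the shared CSV.
        df.to_csv(output_file, mode="a", header=False, index=False)
    else:
        df.to_csv(output_file, index=False)
```

Alternatively, installing pytest-forked in the local environment (pip install pytest-forked) and re-running with --forked should make both the option lookup and the command-line flag work.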
@daniil-lyakhov
Labels
documentation (Improvements or additions to documentation)
NNCF PTQ (Pull requests that update NNCF PTQ)