KeyError: '__module.model.visual.trunk.attn_pool.q/aten::linear/MatMul/fq_weights_1' #3118

Closed
azhuvath opened this issue Nov 27, 2024 · 4 comments
Labels: bug (Something isn't working)

@azhuvath
🐛 Describe the bug

I was running accuracy-aware quantization on the model Marqo/marqo-fashionSigLIP. It ran for more than two hours and then suddenly failed with the error below. It was also converging slowly.
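For reference, the failing call is NNCF's accuracy-control entry point (confirmed by the traceback below). Here is a minimal sketch of how such a script is typically wired; the model path, `data_items`, `transform_fn`, and the metric inside `validate` are illustrative placeholders, not the exact contents of accuracy_aware_quantization.py:

```python
import nncf
import openvino as ov

core = ov.Core()
model = core.read_model("marqo-fashionSigLIP.xml")  # hypothetical IR path

# Placeholder data pipeline: NNCF wraps any iterable plus a transform.
data_items = []                   # e.g. preprocessed image/text batches
transform_fn = lambda item: item  # map a data item to model inputs

calibration_dataset = nncf.Dataset(data_items, transform_fn)
validation_dataset = nncf.Dataset(data_items, transform_fn)

def validate(compiled_model, dataset) -> float:
    # Compute and return the scalar task metric (e.g. accuracy) on `dataset`.
    # NNCF compares it against the FP32 baseline to decide which quantizers
    # to revert to floating point.
    return 0.0  # placeholder

quantized_model = nncf.quantize_with_accuracy_control(
    model,
    calibration_dataset=calibration_dataset,
    validation_dataset=validation_dataset,
    validation_fn=validate,
    max_drop=0.01,                     # tolerated absolute metric drop
    drop_type=nncf.DropType.ABSOLUTE,
)
```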

python accuracy_aware_quantization.py
INFO:nncf:NNCF initialized successfully. Supported frameworks detected: torch, openvino
Fetching 810 samples for the initialization...
0%| | 0/810 [01:58<?, ?it/s]
Fetching 810 samples for the initialization...
0%| | 0/810 [01:58<?, ?it/s]
Statistics collection ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 300/300 • 0:01:12 • 0:00:00
Applying Fast Bias correction ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 127/127 • 0:00:06 • 0:00:00
INFO:nncf:Validation of initial model was started
INFO:nncf:Elapsed Time: 00:00:00
INFO:nncf:Elapsed Time: 00:04:06
INFO:nncf:Metric of initial model: 0.07466079314624856
INFO:nncf:Collecting values for each data item using the initial model
INFO:nncf:Elapsed Time: 00:04:13
INFO:nncf:Validation of quantized model was started
INFO:nncf:Elapsed Time: 00:00:02
INFO:nncf:Elapsed Time: 00:01:30
INFO:nncf:Metric of quantized model: 0.02710435327486572
INFO:nncf:Collecting values for each data item using the quantized model
INFO:nncf:Elapsed Time: 00:01:37
INFO:nncf:Accuracy drop: 0.04755643987138283 (absolute)
INFO:nncf:Accuracy drop: 0.04755643987138283 (absolute)
INFO:nncf:Total number of quantized operations in the model: 284
INFO:nncf:Number of parallel workers to rank quantized operations: 3
INFO:nncf:ORIGINAL metric is used to rank quantizers
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:18:40
INFO:nncf:Changing the scope of quantizer nodes was started
INFO:nncf:Reverted 2 operations to the floating-point precision:
__module.model.text.transformer.resblocks.0/aten::add/Add_1
__module.model.text.transformer.resblocks.0.ln_2/aten::layer_norm/MVN
INFO:nncf:Accuracy drop with the new quantization scope is 0.049556315244946525 (absolute)
INFO:nncf:Reverted 2 operations to the floating-point precision:
__module.model.text.transformer.resblocks.9.ln_2/aten::layer_norm/MVN
__module.model.text.transformer.resblocks.9/aten::add/Add_1
INFO:nncf:Accuracy drop with the new quantization scope is 0.04385334349929537 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.visual.trunk.blocks.0.mlp.fc2/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04903056091956672 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:18:51
INFO:nncf:Reverted 2 operations to the floating-point precision:
__module.model.text.transformer.resblocks.8/aten::add/Add_1
__module.model.text.transformer.resblocks.8.ln_2/aten::layer_norm/MVN
INFO:nncf:Accuracy drop with the new quantization scope is 0.04314702026468723 (absolute)
INFO:nncf:Reverted 2 operations to the floating-point precision:
__module.model.text.transformer.resblocks.4/aten::add/Add_1
__module.model.text.transformer.resblocks.4.ln_2/aten::layer_norm/MVN
INFO:nncf:Accuracy drop with the new quantization scope is 0.053300969254331435 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:18:58
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.11.attn/aten::_native_multi_head_attention/MatMul_5
INFO:nncf:Accuracy drop with the new quantization scope is 0.04706426297524699 (absolute)
INFO:nncf:Reverted 3 operations to the floating-point precision:
__module.model.text.transformer.resblocks.7.attn/aten::_native_multi_head_attention/MatMul_1
__module.model.text.transformer.resblocks.7.attn/aten::_native_multi_head_attention/MatMul_3
__module.model.text.transformer.resblocks.7.attn/aten::_native_multi_head_attention/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04877497193020035 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:18:50
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.3.attn/aten::_native_multi_head_attention/MatMul_5
INFO:nncf:Accuracy drop with the new quantization scope is 0.04464227463266089 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.6.attn/aten::_native_multi_head_attention/Multiply_2
INFO:nncf:Accuracy drop with the new quantization scope is 0.044798036204147 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:19:09
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.1.attn/aten::_native_multi_head_attention/MatMul_5
INFO:nncf:Accuracy drop with the new quantization scope is 0.04424363289904658 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.1.attn/aten::_native_multi_head_attention/MatMul_4
INFO:nncf:Accuracy drop with the new quantization scope is 0.04573362917440334 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:18:55
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.6.attn/aten::_native_multi_head_attention/Multiply_2
INFO:nncf:Accuracy drop with the new quantization scope is 0.044127690084904254 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.7.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.043617145540100266 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.8.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04460481419646217 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:19:50
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.6.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.044974415165658706 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.11.attn/aten::_native_multi_head_attention/Multiply_2
INFO:nncf:Accuracy drop with the new quantization scope is 0.04374533338187629 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11214
INFO:nncf:Accuracy drop with the new quantization scope is 0.044325064887353376 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
INFO:nncf:Elapsed Time: 00:20:12
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.visual.trunk.blocks.10.mlp.fc1/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.visual.trunk.attn_pool.q/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11202
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11204
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.1.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11206
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.2.mlp.c_fc/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.2.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11208
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.3.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11210
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11212
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.5.mlp.c_fc/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.5.mlp.c_proj/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11216
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11218
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11220
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
Multiply_11222
INFO:nncf:Accuracy drop with the new quantization scope is 0.04229756437872184 (absolute)
INFO:nncf:Reverted 1 operations to the floating-point precision:
__module.model.text.transformer.resblocks.1.mlp.c_fc/aten::linear/MatMul
INFO:nncf:Accuracy drop with the new quantization scope is 0.04258855812235635 (absolute)
INFO:nncf:Re-calculating ranking scores for remaining groups
INFO:nncf:Calculating ranking scores
Traceback (most recent call last):
  File "/home/sdp/benchmarking/accuracy_aware_quantization.py", line 223, in <module>
    quantized_model = nncf.quantize_with_accuracy_control(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/telemetry/decorator.py", line 79, in wrapped
    retval = fn(*args, **kwargs)
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/quantization/quantize_model.py", line 368, in quantize_with_accuracy_control
    return quantize_with_accuracy_control_impl(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/openvino/quantization/quantize_model.py", line 299, in quantize_with_accuracy_control_impl
    quantized_model = accuracy_restorer.apply(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/quantization/algorithms/accuracy_control/algorithm.py", line 214, in apply
    return self._apply(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/quantization/algorithms/accuracy_control/algorithm.py", line 383, in _apply
    ranked_groups = ranker.rank_groups_of_quantizers(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/quantization/algorithms/accuracy_control/ranker.py", line 171, in rank_groups_of_quantizers
    ranking_scores = self._multithreading_calculation_ranking_score(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/quantization/algorithms/accuracy_control/ranker.py", line 235, in _multithreading_calculation_ranking_score
    modified_model = revert_operations_to_floating_point_precision(
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/common/quantization/quantizer_removal.py", line 173, in revert_operations_to_floating_point_precision
    transformed_model = model_transformer.transform(transformation_layout)
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/openvino/graph/model_transformer.py", line 141, in transform
    model = transformation_fn(model, transformations)
  File "/home/sdp/benchmarking/siglip_env/lib/python3.10/site-packages/nncf/openvino/graph/model_transformer.py", line 235, in _apply_fq_nodes_removing_transformation
    node = name_to_node_mapping[transformation.target_point.target_node_name]
KeyError: '__module.model.visual.trunk.attn_pool.q/aten::linear/MatMul/fq_weights_1'

Environment

CPU: Intel(R) Xeon(R) Platinum 8480+
OS: Linux 695306-jf5b5r1u37 6.8.0-48-generic #48~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Mon Oct 7 11:24:13 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
Python: 3.10.12

How to create the environment?

python3 -m venv siglip_env
source siglip_env/bin/activate
pip install --upgrade pip
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
pip install pandas requests tqdm timm open-clip-torch transformers scikit-learn openvino nncf seaborn

Minimal Reproducible Example

I followed the notebook https://docs.openvino.ai/2024/notebooks/clip-zero-shot-classification-with-output.html

I changed the task to classification by supplying images and class names. Below is the output from the original model for the different categories: "gender", "masterCategory", "subCategory", "articleType", "baseColour", "season", and "usage".
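The confusion matrices and classification reports below are standard scikit-learn output; a minimal sketch of how they can be generated per category (the labels and predictions here are placeholders, not the actual data):

```python
from sklearn.metrics import classification_report, confusion_matrix

# Placeholder ground truth and zero-shot predictions for one category.
y_true = ["Men", "Women", "Men", "Boys"]
y_pred = ["Men", "Women", "Boys", "Boys"]

print("Confusion Matrix:")
print(confusion_matrix(y_true, y_pred))
print("Classification Report:")
print(classification_report(y_true, y_pred, zero_division=0))
```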

Category: gender
Confusion Matrix:
[[ 34 0 4 0 5]
[ 12 9 78 0 6]
[ 37 58 911 0 2]
[ 4 134 214 358 117]
[ 0 0 1 0 16]]
Classification Report:
              precision    recall  f1-score   support

        Boys       0.39      0.79      0.52        43
      Unisex       0.04      0.09      0.06       105
         Men       0.75      0.90      0.82      1008
       Women       1.00      0.43      0.60       827
       Girls       0.11      0.94      0.20        17

    accuracy                           0.66      2000
   macro avg       0.46      0.63      0.44      2000
weighted avg       0.81      0.66      0.68      2000

Category: masterCategory
Confusion Matrix:
[[403 0 0 0 0 10]
[ 22 320 15 3 0 136]
[ 0 3 906 14 0 43]
[ 0 42 0 61 0 16]
[ 0 4 1 0 0 0]
[ 0 0 0 0 0 1]]
Classification Report:
               precision    recall  f1-score   support

      Footwear       0.95      0.98      0.96       413
   Accessories       0.87      0.65      0.74       496
       Apparel       0.98      0.94      0.96       966
 Personal Care       0.78      0.51      0.62       119
    Free Items       0.00      0.00      0.00         5
Sporting Goods       0.00      1.00      0.01         1

      accuracy                           0.85      2000
     macro avg       0.60      0.68      0.55      2000
  weighted avg       0.93      0.85      0.88      2000

Category: subCategory
Confusion Matrix:
[[ 0 0 0 ... 0 0 0]
[ 0 0 0 ... 0 0 0]
[ 0 0 0 ... 0 1 0]
...
[ 0 0 0 ... 2 0 0]
[ 0 0 0 ... 0 24 0]
[ 0 0 0 ... 0 0 30]]
Classification Report:
                         precision    recall  f1-score   support

              Free Gifts       0.00      0.00      0.00         5
             Accessories       0.00      0.00      0.00         5
                    Eyes       0.00      0.00      0.00         3
                    Ties       0.82      0.90      0.86        10
                 Watches       0.98      1.00      0.99       114
               Skin Care       0.00      0.00      0.00         1
                  Makeup       0.44      0.57      0.50        14
                Mufflers       0.00      0.00      0.00         2
      Beauty Accessories       0.00      0.00      0.00         1
               Fragrance       0.96      0.94      0.95        54
        Sports Equipment       0.03      1.00      0.06         1
                   Dress       0.71      0.59      0.65        17
              Flip Flops       0.29      1.00      0.45        40
                 Wallets       0.86      1.00      0.92        43
               Jewellery       0.98      0.82      0.89        50
             Apparel Set       0.15      1.00      0.26         6
               Innerwear       0.91      0.94      0.93        86
               Cufflinks       0.25      0.88      0.39         8
Loungewear and Nightwear       0.54      0.61      0.57        23
                   Nails       1.00      0.06      0.11        17
                  Sandal       0.55      0.27      0.36        44
                 Topwear       0.99      0.89      0.93       696
                 Eyewear       0.95      1.00      0.98        40
                  Stoles       0.22      0.50      0.31         4
                   Socks       1.00      1.00      1.00        38
                    Bags       1.00      0.83      0.91       129
                   Saree       0.92      1.00      0.96        12
                Headwear       0.68      0.87      0.76        15
                   Shoes       1.00      0.77      0.87       329
           Bath and Body       0.00      0.00      0.00         1
              Bottomwear       0.89      0.80      0.85       126
                 Scarves       0.62      1.00      0.76         8
                    Hair       1.00      1.00      1.00         2
                    Lips       0.69      0.92      0.79        26
                   Belts       0.86      1.00      0.92        30

                accuracy                           0.84      2000
               macro avg       0.58      0.66      0.57      2000
            weighted avg       0.91      0.84      0.86      2000

Category: articleType
Confusion Matrix:
[[ 3 0 0 ... 0 0 0]
[ 0 12 0 ... 0 0 0]
[ 0 0 7 ... 0 0 0]
...
[ 0 0 0 ... 0 0 0]
[ 0 0 0 ... 0 29 0]
[ 0 0 0 ... 0 0 1]]
Classification Report:
                      precision    recall  f1-score   support

               Boxers       0.23      1.00      0.38         3
           Nightdress       0.60      0.92      0.73        13
               Skirts       1.00      1.00      1.00         7
               Capris       0.00      0.00      0.00        13
              Dupatta       0.00      0.00      0.00         3
         Clothing Set       0.20      1.00      0.33         1
                 Caps       1.00      0.93      0.97        15
        Lounge Shorts       0.17      0.50      0.25         2
Highlighter and Blush       0.00      0.00      0.00         2
              Wallets       1.00      0.84      0.91        43
   Fragrance Gift Set       0.33      0.33      0.33         3
          Hair Colour       0.67      1.00      0.80         2
Perfume and Body Mist       0.74      0.83      0.78        35
            Cufflinks       0.80      0.50      0.62         8
          Basketballs       0.14      1.00      0.25         1
               Stoles       0.00      0.00      0.00         4
          Lip Plumper       0.00      0.00      0.00         1
          Sweatshirts       0.75      0.75      0.75        16
                Jeans       0.81      1.00      0.90        30
           Suspenders       0.25      1.00      0.40         1
             Bracelet       0.57      1.00      0.73         4
            Sunscreen       0.00      0.00      0.00         1
               Salwar       0.67      0.67      0.67         3
              Dresses       0.80      0.25      0.38        16
             Earrings       0.76      1.00      0.87        13
         Sports Shoes       0.83      0.77      0.80        94
                Trunk       0.00      0.00      0.00         9
              Sandals       0.50      0.05      0.09        40
                Belts       0.97      1.00      0.98        30
           Free Gifts       0.00      0.00      0.00         4
              Compact       0.00      0.00      0.00         2
             Churidar       0.00      0.00      0.00         2
              Watches       0.98      1.00      0.99       114
            Waistcoat       0.00      0.00      0.00         1
        Jewellery Set       1.00      0.33      0.50         3
   Kajal and Eyeliner       1.00      0.29      0.44         7
                Heels       0.87      0.24      0.38        54
       Sports Sandals       0.13      1.00      0.23         5
           Flip Flops       0.38      1.00      0.56        40
          Rain Jacket       0.20      1.00      0.33         1
      Nail Essentials       0.00      0.00      0.00         1
          Nail Polish       1.00      0.18      0.30        17
     Beauty Accessory       0.00      0.00      0.00         1
               Tunics       0.04      0.75      0.08         8
           Laptop Bag       0.17      1.00      0.29         3
            Lip Liner       0.29      1.00      0.44         2
              Tshirts       0.89      0.85      0.87       332
          Track Pants       0.65      0.92      0.76        12
               Bangle       0.00      0.00      0.00         3
                 Ring       1.00      1.00      1.00         6
               Kurtis       0.00      0.00      0.00         9
           Kurta Sets       0.33      0.20      0.25         5
              Jackets       0.50      0.14      0.22         7
            Backpacks       1.00      0.67      0.81        46
              Mascara       0.50      1.00      0.67         1
        Messenger Bag       0.09      1.00      0.17         2
                 Ties       0.90      0.90      0.90        10
             Trousers       1.00      0.64      0.78        25
               Briefs       1.00      0.97      0.99        37
           Sunglasses       0.98      1.00      0.99        40
             Mufflers       0.00      0.00      0.00         2
           Tracksuits       0.67      1.00      0.80         2
Foundation and Primer       0.67      0.50      0.57         4
         Mobile Pouch       0.00      0.00      0.00         1
               Kurtas       0.00      0.00      0.00        90
         Casual Shoes       0.81      0.75      0.78       124
                 Robe       0.00      0.00      0.00         1
  Necklace and Chains       0.90      0.90      0.90        10
             Sweaters       1.00      0.44      0.61        16
                Socks       1.00      1.00      1.00        38
            Stockings       1.00      1.00      1.00         1
             Clutches       0.43      0.90      0.58        10
               Shirts       0.75      0.83      0.79       141
              Scarves       0.69      1.00      0.82         9
             Lipstick       0.86      0.86      0.86        14
            Deodorant       0.64      0.56      0.60        16
              Pendant       0.79      1.00      0.88        11
               Shorts       0.78      0.67      0.72        21
               Sarees       0.92      1.00      0.96        12
                Flats       0.20      0.07      0.11        27
         Lounge Pants       0.10      0.33      0.15         3
             Handbags       1.00      0.22      0.36        63
            Lip Gloss       0.39      0.78      0.52         9
                 Tops       0.95      0.25      0.40        71
           Duffel Bag       0.14      1.00      0.24         4
      Innerwear Vests       0.19      0.62      0.29         8
             Leggings       0.60      1.00      0.75         6
            Concealer       0.17      1.00      0.29         1
         Formal Shoes       0.61      0.79      0.69        29
             Jumpsuit       1.00      1.00      1.00         1
   Accessory Gift Set       1.00      1.00      1.00         5
              Patiala       0.00      0.00      0.00         4
                  Bra       1.00      1.00      1.00        29
          Night suits       1.00      0.25      0.40         4

             accuracy                           0.69      2000
            macro avg       0.52      0.60      0.49      2000
         weighted avg       0.76      0.69      0.68      2000

Category: baseColour
Confusion Matrix:
[[237 7 2 ... 2 42 0]
[ 0 37 0 ... 0 0 0]
[ 0 1 1 ... 0 5 0]
...
[ 2 0 0 ... 0 0 0]
[ 0 0 0 ... 0 2 0]
[ 0 0 0 ... 0 0 0]]
Classification Report:
                  precision    recall  f1-score   support

            Black       0.92      0.50      0.65       473
           Yellow       0.63      0.88      0.73        42
            Cream       0.09      0.05      0.07        19
            Mauve       0.00      0.00      0.00         4
           Orange       0.48      0.42      0.44        24
            Beige       0.42      0.55      0.47        33
            Brown       1.00      0.02      0.04       140
           Maroon       0.00      0.00      0.00        23
           Silver       0.39      0.45      0.42        42
             Grey       0.50      0.03      0.05       117
            Green       0.77      0.30      0.43        99
   Turquoise Blue       0.01      0.50      0.02         2
         Burgundy       0.02      1.00      0.04         1
             Blue       0.88      0.06      0.12       224
          Magenta       0.05      0.33      0.09         6
            Steel       0.10      0.12      0.11        17
     Coffee Brown       0.01      1.00      0.02         1
            Khaki       0.07      0.75      0.13         4
         Metallic       0.00      0.00      0.00         1
           Bronze       0.00      0.00      0.00         1
            Peach       0.00      0.00      0.00        14
             Teal       0.08      0.10      0.09        10
             Gold       0.55      0.67      0.60        27
            Multi       0.00      0.00      0.00        14
             Nude       0.03      0.50      0.05         2
            Olive       0.14      0.07      0.09        15
        Navy Blue       0.41      0.65      0.50       100
           Copper       0.00      0.00      0.00         7
   Mushroom Brown       0.00      0.00      0.00         1
     Grey Melange       0.02      1.00      0.04         4
              Tan       0.00      0.00      0.00         2
             Rose       0.00      0.00      0.00         1
             Pink       0.66      0.54      0.60        72
             Skin       0.29      0.20      0.24        10
            White       0.85      0.14      0.24       241
              Red       0.58      0.66      0.62        99
         Lavender       0.50      0.38      0.43         8
           Purple       0.78      0.55      0.64        66
          Mustard       0.50      0.20      0.29        10
Fluorescent Green       0.02      1.00      0.03         1
         Charcoal       0.00      0.00      0.00        13
        Off White       0.01      0.25      0.02         8
             Rust       0.00      0.00      0.00         2

         accuracy                           0.33      2000
        macro avg       0.27      0.32      0.19      2000
     weighted avg       0.71      0.33      0.37      2000

Category: season
Confusion Matrix:
[[125 63 48 132]
[245 190 121 438]
[ 33 5 43 59]
[138 76 69 215]]
Classification Report:
              precision    recall  f1-score   support

      Winter       0.23      0.34      0.28       368
      Summer       0.57      0.19      0.29       994
      Spring       0.15      0.31      0.20       140
        Fall       0.25      0.43      0.32       498

    accuracy                           0.29      2000
   macro avg       0.30      0.32      0.27      2000
weighted avg       0.40      0.29      0.29      2000

Category: usage
Confusion Matrix:
[[ 1 0 0 0 0 0 1]
[ 0 52 0 0 2 94 30]
[ 0 0 0 0 0 0 1]
[ 0 0 0 46 1 25 36]
[ 0 0 0 0 123 22 2]
[ 20 68 2 50 168 943 311]
[ 0 0 0 0 0 1 1]]
Classification Report:
              precision    recall  f1-score   support

      Travel       0.05      0.50      0.09         2
      Sports       0.43      0.29      0.35       178
       Party       0.00      0.00      0.00         1
      Formal       0.48      0.43      0.45       108
      Ethnic       0.42      0.84      0.56       147
      Casual       0.87      0.60      0.71      1562
Smart Casual       0.00      0.50      0.01         2

    accuracy                           0.58      2000
   macro avg       0.32      0.45      0.31      2000
weighted avg       0.77      0.58      0.65      2000

Are you going to submit a PR?

  • Yes, I'd like to help by submitting a PR!
azhuvath added the bug label on Nov 27, 2024
@alexsu52
Contributor

@KodiaqQ, please take a look.

@KodiaqQ
Collaborator

KodiaqQ commented Dec 4, 2024

Potential fix: #3127. The original issue still needs to be verified against that PR.

andrey-churkin pushed a commit that referenced this issue Dec 13, 2024
### Changes

- Fixed AAQ algo propagation for quantizable types.

### Reason for changes

- Bugfix.

### Related tickets

- #3118

### Tests

- TBD
@MaximProshin
Collaborator

@azhuvath, could you please check whether #3127 fixed your problem? It was merged to develop, by the way.
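One way to try the merged fix before a release is to install NNCF directly from the develop branch, e.g. `pip install git+https://github.com/openvinotoolkit/nncf.git@develop` (standard pip-from-git usage; the branch name is taken from the comment above).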

@azhuvath
Author

Closing, since a patch has been provided.
