Fix a bug of quantization (PaddlePaddle#36982) (PaddlePaddle#37381)
* fix a quantization bug

Co-authored-by: XGZhang <[email protected]>
ceci3 and XGZhang11 authored Nov 22, 2021
1 parent 109f8a8 · commit 9ffb43b
Showing 1 changed file with 3 additions and 2 deletions.
```diff
@@ -1292,10 +1292,11 @@ def _insert_post_channel_dequant_op(self, graph, op_node, quant_axis):
             var_type=output_var_node.type(),
             shape=output_var_node.shape(),
             var_dtype=output_var_node.dtype())
-        x_num_col_dims = 1
-        if op_node.name() in ['matmul', 'matmul_v2', 'mul']:
-            x_num_col_dims = len(op_node.outputs[0].shape()) - 1
+        if op_node.op().has_attr("x_num_col_dims"):
+            x_num_col_dims = op_node.op().attr("x_num_col_dims")
+        else:
+            x_num_col_dims = 1
         dequant_op_node = graph.create_op_node(
             op_type='fake_channel_wise_dequantize_max_abs',
             attrs={
```
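For context, the change replaces name-based inference of `x_num_col_dims` with a lookup of the op's own attribute. Below is a minimal, self-contained sketch of the two selection rules; `MockOp` is a hypothetical stand-in for the IR op node, not PaddlePaddle's real API, written only to show where the rules diverge:

```python
# Hedged sketch: MockOp is a hypothetical stand-in for the graph's op node,
# used only to contrast the old and new selection rules from this commit.
class MockOp:
    def __init__(self, name, attrs, out_shape):
        self._name = name            # op type name, e.g. 'mul' or 'matmul_v2'
        self._attrs = attrs          # op attributes as a plain dict
        self._out_shape = out_shape  # shape of the op's first output

    def name(self):
        return self._name

    def has_attr(self, key):
        return key in self._attrs

    def attr(self, key):
        return self._attrs[key]

    def output_shape(self):
        return self._out_shape


def x_num_col_dims_before(op):
    # Old rule (removed lines): default to 1, and for matmul/matmul_v2/mul
    # infer the value from the rank of the output variable.
    x_num_col_dims = 1
    if op.name() in ['matmul', 'matmul_v2', 'mul']:
        x_num_col_dims = len(op.output_shape()) - 1
    return x_num_col_dims


def x_num_col_dims_after(op):
    # New rule (added lines): use the op's own 'x_num_col_dims' attribute
    # when it defines one, otherwise fall back to 1.
    if op.has_attr('x_num_col_dims'):
        return op.attr('x_num_col_dims')
    return 1


if __name__ == '__main__':
    # An op without the attribute and a rank-3 output: the old rule infers
    # 2 from the output rank, while the new rule falls back to 1.
    op_a = MockOp('matmul_v2', attrs={}, out_shape=[2, 3, 4])
    print(x_num_col_dims_before(op_a), x_num_col_dims_after(op_a))  # 2 1

    # A 'mul' op that carries the attribute: the new rule reads it directly
    # instead of re-deriving it from the output shape.
    op_b = MockOp('mul', attrs={'x_num_col_dims': 2}, out_shape=[6, 4, 7])
    print(x_num_col_dims_before(op_b), x_num_col_dims_after(op_b))  # 2 2
```

Under this sketch's assumptions, the attribute-based rule keeps the inserted dequantize op's `x_num_col_dims` consistent with what the quantized op actually declares, rather than guessing it from the output rank.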
