After reproducing and looking into #73, I found that the issue arises from a `BoundReLU` module with output shape `[..., 50, 40]` being bound-backwarded with patches `last_lA` and `last_uA` whose parameters are `stride=2, padding=3, height=25, width=20, kernel_size=[9,9]` in `bound_backward(.)`. An image of shape `[..., 50, 40]`, passed through this stride=2, padding=3, kernel_size=[9,9] "convolution" (patch) operator, yields an output of shape $h=\lfloor \frac{50 + 3\times2 - (9 - 1) - 1}{2} + 1 \rfloor = 24$, $w=\lfloor \frac{40 + 3\times2 - (9 - 1) - 1}{2} + 1 \rfloor = 19$ in the `inplace_unfold` function, which is inconsistent with the expected shape `[25, 20]`, and the shape error follows. The problem is that the parameters of `lA` on the `13:23:45` lines of the log below are incorrect: I believe the kernel size of `lA` there should be 7 instead of 9.
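Before the log, here is a quick sanity check of that shape arithmetic (a minimal sketch using the standard convolution output-size formula; `conv_out_size` is my own helper, not an auto_LiRPA function):

```python
# Standard convolution output-size formula: floor((H + 2*P - K) / S) + 1.
def conv_out_size(size, kernel, stride, padding):
    return (size + 2 * padding - kernel) // stride + 1

# With the parameters from the 13:23:45 log lines (kernel_size=9):
print(conv_out_size(50, 9, 2, 3), conv_out_size(40, 9, 2, 3))  # 24 19 -- mismatch

# With the kernel size I believe is correct (7), the shapes match [25, 20]:
print(conv_out_size(50, 7, 2, 3), conv_out_size(40, 7, 2, 3))  # 25 20
```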
```
DEBUG 13:23:20 Bound backward from BoundBatchNormalization(/input.16) to bound BoundBatchNormalization(/input.16)
DEBUG 13:23:20 C: shape [64, 4, 25, 20, 64, 1, 1], type <class 'auto_LiRPA.patches.Patches'>
DEBUG 13:23:21 Bound backward to BoundBatchNormalization(name=/input.16, inputs=[/input.12, /8, /9, /10, /11], perturbed=True) (out shape torch.Size([4, 64, 25, 20]))
DEBUG 13:23:21 lA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 1, 1]
DEBUG 13:23:21 uA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 1, 1]
DEBUG 13:23:21 lA padding 0, stride 1, inserted_zeros 0
DEBUG 13:23:23 Bound backward to BoundConv(name=/input.12, inputs=[/input.8, /7], perturbed=True) (out shape torch.Size([4, 64, 25, 20]))
DEBUG 13:23:23 lA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 1, 1]
DEBUG 13:23:23 uA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 1, 1]
DEBUG 13:23:23 lA padding 0, stride 1, inserted_zeros 0
DEBUG 13:23:27 Bound backward to BoundAveragePool(name=/input.8, inputs=[/139], perturbed=True) (out shape torch.Size([4, 64, 25, 20]))
DEBUG 13:23:27 lA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 3, 3]
DEBUG 13:23:27 uA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 3, 3]
DEBUG 13:23:27 lA padding 1, stride 1, inserted_zeros 0
DEBUG 13:23:45 Bound backward to BoundRelu(name=/139, inputs=[/input.4], perturbed=True) (out shape torch.Size([4, 64, 50, 40]))
DEBUG 13:23:45 lA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 9, 9]
DEBUG 13:23:45 uA type <class 'auto_LiRPA.patches.Patches'> shape [64, 4, 25, 20, 64, 9, 9]
DEBUG 13:23:45 lA padding 3, stride 2, inserted_zeros 0
```
where the parameters are `BoundConv=[stride=2, padding=1, kernel_size=3]` and `BoundAveragePool=[stride=1, padding=1, kernel_size=3]`. If you fuse two convolution-like operators, you actually get a new convolution operator with parameters `[stride=2, padding=3, kernel_size=7]`, not `kernel_size=9`. According to my calculations, when fusing $conv_2(conv_1(\cdot))$ into a single $conv$, the new convolution operator has parameters $K=K_1 + (K_2-1)S_1$, $S=S_1 S_2$, $P=S_1P_2 + P_1$.
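For a 1-D slice, the formula can be checked by substituting one cross-correlation into the other:

$$y[j]=\sum_b w_2[b]\,y_1[S_2 j - P_2 + b],\qquad y_1[i]=\sum_a w_1[a]\,x[S_1 i - P_1 + a]$$

$$\Rightarrow\quad y[j]=\sum_{a,b} w_2[b]\,w_1[a]\,x\big[S_1 S_2\, j-(S_1 P_2+P_1)+(S_1 b + a)\big],$$

so the fused kernel index $k = S_1 b + a$ ranges over $0,\dots,K_1+(K_2-1)S_1-1$, giving $K=K_1+(K_2-1)S_1$.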
The code, however, currently behaves as if $K=K_1K_2$, $S=S_1S_2$, $P=S_1P_2+P_1$, implicitly computed through `conv_transpose2d`.
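For reference, a minimal PyTorch sketch (single channel, my own code, not auto_LiRPA's) checking the fusion formula on the parameters from the log. Note that `conv_transpose2d` with `stride=S1` does produce the expected 7x7 fused kernel:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# conv1: the BoundConv step [K1=3, S1=2, P1=1];
# conv2: the BoundAveragePool step expressed as a conv [K2=3, S2=1, P2=1].
K1, S1, P1 = 3, 2, 1
K2, S2, P2 = 3, 1, 1
w1 = torch.randn(1, 1, K1, K1)
w2 = torch.full((1, 1, K2, K2), 1.0 / (K2 * K2))  # average pooling as a conv

x = torch.randn(1, 1, 50, 40)
y_ref = F.conv2d(F.conv2d(x, w1, stride=S1, padding=P1), w2, stride=S2, padding=P2)

# Fused kernel: w1 "fully correlated" with w2 dilated by S1, which is exactly
# what conv_transpose2d computes. Its size is K1 + (K2-1)*S1 = 7, not 9.
w_fused = F.conv_transpose2d(w2, w1, stride=S1)
K, S, P = K1 + (K2 - 1) * S1, S1 * S2, S1 * P2 + P1
assert w_fused.shape[-1] == K == 7
y_fused = F.conv2d(x, w_fused, stride=S, padding=P)

print(y_ref.shape, y_fused.shape)  # both torch.Size([1, 1, 25, 20])
# Interior entries agree exactly; the last row/column can differ because the
# zero-padding of the intermediate feature map at the bottom/right border is
# not reproducible by the fused convolution.
print(torch.allclose(y_ref[..., :-1, :-1], y_fused[..., :-1, :-1], atol=1e-5))  # True
```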
(I hope someone can discuss this with me.)
(I omit `inserted_zeros` because it seems to always be zero, based on skimming the code and running the test.)
It is strange that such a bug would exist, since auto_LiRPA has gone through many tests, so please educate me if I am missing something important.
Just to confirm: if I understand the design of the `Patches` class correctly, a `Patches` object with shape `[..., c, h, w, K1, K2]` and parameters `padding`, `stride`, `inserted_zeros` (=0) represents a linear relation over the input entries for each entry in the $c\times h\times w$ output space, where each output entry is computed as the inner product of its $K_1 \times K_2$ kernel and a square window of the input, and the position of that window is determined by `padding`, `stride`, and the position of the output entry, just like a convolution operator.
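A small sketch of this reading of the semantics (plain PyTorch; the variable names and the unfold-based formulation are mine, not auto_LiRPA's API), using the parameters I believe are correct for the ReLU step (kernel 7, stride 2, padding 3 on a 50x40 input):

```python
import torch
import torch.nn.functional as F

c_out, c_in, H, W = 2, 64, 50, 40
K, stride, padding = 7, 2, 3
h_out = (H + 2 * padding - K) // stride + 1   # 25
w_out = (W + 2 * padding - K) // stride + 1   # 20

# One K x K kernel per output entry in the c_out x h_out x w_out space.
patches = torch.randn(c_out, h_out, w_out, c_in, K, K)
x = torch.randn(1, c_in, H, W)

# Unfold the input exactly as a convolution with these parameters would read
# it, then take per-position inner products with the kernels.
cols = F.unfold(x, K, stride=stride, padding=padding)   # [1, c_in*K*K, h_out*w_out]
cols = cols.view(c_in, K, K, h_out, w_out)
out = torch.einsum('ohwikl,iklhw->ohw', patches, cols)  # [c_out, h_out, w_out]
print(out.shape)  # torch.Size([2, 25, 20])
```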