This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Environment
mxnet-2.0.0b20200421 installed by pip
I could not run the latest version (mxnet-2.0.0b20200516) of MXNet 2.0 on my laptop since libopenblas.so.0 is not found :(
----------Python Info----------
Version : 3.8.3
Compiler : GCC 10.1.0
Build : ('default', 'May 17 2020 18:15:42')
Arch : ('64bit', 'ELF')
------------Pip Info-----------
Version : 20.0.2
Directory : /usr/lib/python3.8/site-packages/pip
----------MXNet Info-----------
Version : 2.0.0
Directory : /usr/lib/python3.8/site-packages/mxnet
Hashtag not found. Not installed from pre-built package.
----------System Info----------
Platform : Linux-5.6.15-arch1-1-x86_64-with-glibc2.2.5
system : Linux
node : MiraiT
release : 5.6.15-arch1-1
version : #1 SMP PREEMPT Wed, 27 May 2020 23:42:26 +0000
----------Hardware Info----------
machine : x86_64
processor :
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Byte Order: Little Endian
wkcn changed the title from [BUG] The wrong gradient of Batch Norm when grad_req = True to [BUG] The wrong gradient of Batch Norm when grad_req = add on Jun 5, 2020
Hi @sxjscience, I got the same result with 1.7 as with mxnet-2.0.0b20200421. As this is a functionality bug exposed before the final release, I suggest including this fix in 1.7 as well. @wkcn, would you mind helping to backport the fix to 1.7.x and 1.x once it is merged to master? Thanks!
Description
Hi there, we found that the current implementation of the batch norm layer does not correctly support grad_req = add. If grad_req is set to add, the gradient of the input data is not accumulated. Besides, the gradients of gamma and beta are mistakenly not assigned any value.
To Reproduce
It outputs the following message:
mxnet-2.0.0b20200421 installed by pip
MXNet 1.6 installed by pip --pre
The correct result should be
These values are the gradients of the input data, gamma, and beta, respectively. The gradients are wrong.
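The expected semantics of grad_req = add can be illustrated with a plain-NumPy batch-norm forward/backward (a hypothetical sketch, not the issue's original reproduction script): running the backward pass twice and accumulating into the same gradient buffers should exactly double dx, dgamma, and dbeta, which is what the buggy implementation fails to do for dx (and it leaves dgamma/dbeta unset).

```python
# Hypothetical NumPy sketch of the accumulation semantics that
# grad_req = 'add' is supposed to provide; not the issue's original script.
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Normalize over the batch axis (axis 0), per feature.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    xhat = (x - mean) / np.sqrt(var + eps)
    return gamma * xhat + beta, (xhat, var, eps)

def batchnorm_backward(dy, x, gamma, cache):
    xhat, var, eps = cache
    n = x.shape[0]
    dgamma = (dy * xhat).sum(axis=0)
    dbeta = dy.sum(axis=0)
    # Standard training-mode batch-norm input gradient.
    dxhat = dy * gamma
    inv_std = 1.0 / np.sqrt(var + eps)
    dx = (inv_std / n) * (n * dxhat
                          - dxhat.sum(axis=0)
                          - xhat * (dxhat * xhat).sum(axis=0))
    return dx, dgamma, dbeta

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3))
gamma = rng.standard_normal(3)
beta = rng.standard_normal(3)
dy = rng.standard_normal((8, 3))

y, cache = batchnorm_forward(x, gamma, beta)

# Gradient buffers that we accumulate into, mimicking grad_req = 'add'.
dx_acc = np.zeros_like(x)
dgamma_acc = np.zeros_like(gamma)
dbeta_acc = np.zeros_like(beta)
for _ in range(2):  # two identical backward passes
    dx, dgamma, dbeta = batchnorm_backward(dy, x, gamma, cache)
    dx_acc += dx
    dgamma_acc += dgamma
    dbeta_acc += dbeta

# With correct accumulation, two passes give exactly twice one pass.
print(np.allclose(dx_acc, 2 * dx),
      np.allclose(dgamma_acc, 2 * dgamma),
      np.allclose(dbeta_acc, 2 * dbeta))
```

With grad_req = 'write' the second backward would instead overwrite the buffers; the bug reported here is that 'add' behaves neither way correctly for the batch norm operator.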