Why is Flax Linear not identical to matrix multiplication? #4020
-
Due to the novelty of Flax, NNX, and JAX, there aren't many resources available, and I'm running into the following peculiarity:
My understanding is that matrix multiplication should be identical to a linear / fully connected layer. The discrepancy demonstrated here hinders the inspection of certain behavior (and the implementation of invertible dense layers). Does anyone know what causes this discrepancy?
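For concreteness, the kind of comparison described can be sketched in plain JAX. That `nnx.Linear` lowers to `lax.dot_general` is an assumption based on the Flax source; the shapes here are illustrative, not taken from the original post:

```python
import jax
import jax.numpy as jnp

# Illustrative shapes, not from the original post.
x = jax.random.normal(jax.random.PRNGKey(0), (8, 16))
w = jax.random.normal(jax.random.PRNGKey(1), (16, 4))

# Plain matrix multiplication.
y_matmul = x @ w

# nnx.Linear dispatches to lax.dot_general (assumption); a different
# primitive can accumulate the sums in a different order, so the two
# results may agree only up to floating-point tolerance, not bit-for-bit.
y_dot = jax.lax.dot_general(x, w, dimension_numbers=(((1,), (0,)), ((), ())))

print(jnp.allclose(y_matmul, y_dot))
```

So a small elementwise difference between `nnx.Linear(x)` and `x @ kernel + bias` would be consistent with ordinary floating-point rounding rather than a different computation.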
Replies: 1 comment
-
The answer has been provided on Stack Overflow. So, to invert an
`nnx.Linear`
operation using `nnx.tensorsolve`:
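The code that followed is not preserved in this export. Below is a minimal sketch of the inversion, using `jnp.linalg.tensorsolve` in place of the `nnx.tensorsolve` name quoted above (which I could not confirm exists), and stand-in parameters rather than an actual `nnx.Linear` layer:

```python
import jax
import jax.numpy as jnp

# Stand-in kernel and bias for a hypothetical nnx.Linear(3, 3) layer
# (values are illustrative, not from the thread).
kernel = jax.random.normal(jax.random.PRNGKey(0), (3, 3))
bias = jax.random.normal(jax.random.PRNGKey(1), (3,))

x = jnp.array([1.0, 2.0, 3.0])
y = x @ kernel + bias  # what a dense layer computes

# Invert: y = x @ kernel + bias  <=>  kernel.T @ x = y - bias,
# so solve the transposed system for x.
x_recovered = jnp.linalg.tensorsolve(kernel.T, y - bias)

print(jnp.allclose(x, x_recovered, atol=1e-4))
```

This only works when the kernel is square and invertible; for a real `nnx.Linear`, the same solve would be applied to `layer.kernel.value` and `layer.bias.value`.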