About the hardware implementation of BN and activation functions in the FINN architecture #884

Answered by fpjentzsch
sansi-zhang asked this question in Q&A

Hi,
exactly, BatchNorm, non-linear activation functions, and quantizer scale values will all be collapsed into threshold operations by FINN's streamlining transformations. The thresholds can be implemented either as part of an MVAU (Matrix-Vector-Activation Unit) or as an MVU (Matrix-Vector Unit) plus a standalone MultiThreshold. Here is a related thread on how the MultiThreshold works: #914
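
To make the folding concrete, here is a small self-contained sketch (my own illustration, not code from FINN or this thread): an affine BatchNorm followed by a uniform quantizer is a monotone step function, so it is fully described by the input values at which the output steps up, and those values become the thresholds of a MultiThreshold.

```python
import numpy as np

def bn_then_quant(x, gamma, beta, levels):
    # Float reference: BatchNorm folded to an affine map, followed by a
    # uniform ReLU-style quantizer with `levels` output levels.
    y = gamma * x + beta
    return np.clip(np.round(y), 0, levels - 1)

def multithreshold(x, thresholds):
    # Integer-only equivalent: output = number of thresholds the input crosses.
    return np.sum(x[..., None] >= thresholds, axis=-1)

gamma, beta, levels = 1.5, -0.3, 4
# Thresholds are the preimages of the quantizer's step points under the affine map:
# gamma * x + beta = k - 0.5  =>  x = (k - 0.5 - beta) / gamma, for k = 1..levels-1
T = np.array([(k - 0.5 - beta) / gamma for k in range(1, levels)])

x = np.linspace(-2.0, 3.0, 11)
assert np.array_equal(bn_then_quant(x, gamma, beta, levels), multithreshold(x, T))
```

The same argument covers non-linear activations and quantizer scales: as long as the composed float function is monotone in the input, it collapses into a single set of precomputed thresholds.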

In general, you would apply the default sequence of FINN transformation steps (from "build_dataflow") to your model and see where streamlining (and later the conversion to HLS-backend layers) breaks. Then adjust the transformation steps until you arrive at a correct model where all float operations have been…
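
As a hedged sketch of that workflow (the step names and config fields below match recent FINN releases, but check finn.builder.build_dataflow_steps and finn.builder.build_dataflow_config for your version; "model.onnx" and the FPGA part are placeholders):

```python
import finn.builder.build_dataflow as build
import finn.builder.build_dataflow_config as build_cfg

cfg = build_cfg.DataflowBuildConfig(
    output_dir="build_output",        # checkpoint .onnx files are written here
    synth_clk_period_ns=10.0,
    fpga_part="xc7z020clg400-1",      # placeholder, set your target device
    steps=[
        "step_qonnx_to_finn",         # convert QONNX/Brevitas export to FINN ops
        "step_tidy_up",
        "step_streamline",            # BN/activations/scales collapse into thresholds
    ],
    generate_outputs=[],              # no bitfile yet, we only want the checkpoints
)
build.build_dataflow_cfg("model.onnx", cfg)
```

By default FINN saves a checkpoint model after each step under output_dir/intermediate_models, so you can open the post-streamlining ONNX in Netron, check whether any float operations survived, and only then re-enable the remaining steps (conversion to HLS layers, partitioning, and so on).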
