Optional fusion of convolution and BatchNorm layers during inference #185
Labels: enhancement (New feature or request), layers (Related to the Layers module - generic layers for reuse by models)
Convolution and BatchNorm layers are fused during inference in many models, most notably LeViT. It would be a good idea to have this as an option in the `conv_norm` function in `Layers`,
with a dispatch specifically for `BatchNorm`.
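
For context, here is a minimal sketch of what such a fusion could look like, folding the BatchNorm running statistics and affine parameters into the preceding Conv's weight and bias. The `fuse_conv_bn` name is hypothetical, and it assumes the current Flux field names for `Conv` (`weight`, `bias`, `stride`, `pad`, `dilation`, `groups`) and `BatchNorm` (`γ`, `β`, `μ`, `σ²`, `ϵ`, `λ`), with the activation sitting on the norm layer as `conv_norm` usually constructs it:

```julia
using Flux

# Sketch: fold a BatchNorm into the preceding Conv at inference time.
# BN(Wx + b) = scale .* (Wx + b - μ) + β  with  scale = γ ./ sqrt.(σ² .+ ϵ),
# which is itself a convolution with weight scale .* W and bias β + scale .* (b - μ).
function fuse_conv_bn(c::Conv, bn::BatchNorm)
    # Per-output-channel scale from the BatchNorm affine weight and running variance.
    scale = bn.γ ./ sqrt.(bn.σ² .+ bn.ϵ)
    # Flux stores the output channels in the last dimension of the conv weight.
    w = c.weight .* reshape(scale, ntuple(_ -> 1, ndims(c.weight) - 1)..., :)
    # conv_norm typically builds the Conv with bias = false, stored as `false` in Flux.
    b0 = c.bias === false ? 0 : c.bias
    b = bn.β .+ scale .* (b0 .- bn.μ)
    Conv(w, b, bn.λ; stride = c.stride, pad = c.pad,
         dilation = c.dilation, groups = c.groups)
end
```

A keyword on `conv_norm` (or a small utility in `Layers`) could then swap in the fused `Conv` when the model is put into inference mode, and dispatching on `BatchNorm` keeps the door open for other norm layers that cannot be folded this way.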