Backbone of FBA. #215
Conversation
Codecov Report
@@ Coverage Diff @@
## master #215 +/- ##
==========================================
- Coverage 82.27% 81.71% -0.57%
==========================================
Files 145 148 +3
Lines 6732 7055 +323
Branches 1004 1047 +43
==========================================
+ Hits 5539 5765 +226
- Misses 1086 1171 +85
- Partials 107 119 +12
                 batch_norm=False,
                 norm_cfg=None,
                 act_cfg=dict(type='ReLU')):
        super(FBADecoder, self).__init__()
Use the Python 3 zero-argument form: super().__init__()
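A minimal sketch of the suggested change, reusing the signature from the diff above:

```python
import torch.nn as nn


class FBADecoder(nn.Module):

    def __init__(self,
                 batch_norm=False,
                 norm_cfg=None,
                 act_cfg=dict(type='ReLU')):
        # The zero-argument form resolves the same entry in the MRO as
        # super(FBADecoder, self) but avoids repeating the class name.
        super().__init__()
        self.batch_norm = batch_norm
```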
        x = torch.cat((x, conv_out[-6][:, :3], img, two_chan_trimap), 1)

        output = self.conv_up4(x)
        alpha = torch.clamp(output[:, 0][:, None], 0, 1)
[:, 0][:, None] -> [:, 0:1]
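Both spellings keep the channel dimension; [:, 0:1] does it in a single slice. A quick self-contained check (shapes are illustrative):

```python
import torch

output = torch.randn(2, 7, 8, 8)  # (N, C, H, W)

a = output[:, 0][:, None]  # index channel 0, then re-insert the channel dim
b = output[:, 0:1]         # one slice, same result
assert a.shape == b.shape == (2, 1, 8, 8)
assert torch.equal(a, b)
```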
@@ -0,0 +1,119 @@
import torch
Please give credit if this code is adapted from the original FBA implementation.
            norm(256, self.batch_norm), nn.LeakyReLU())

        # Keep the batch_norm here in case we may need to modify something
        if (self.batch_norm):
Superfluous parentheses around the condition.
            ConvWS2d(256 + 256, 256, kernel_size=3, padding=1, bias=True),
            norm(256, self.batch_norm), nn.LeakyReLU())

        # Keep the batch_norm here in case we may need to modify something
What does this comment mean: "in case we may need to modify something"?
            nn.Conv2d(16, 7, kernel_size=1, padding=0, bias=True))

    def init_weights(self, pretrained=None):
        pass
Is this pass intended?
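If the empty body is unintentional, the usual mm-series pattern for the method body is roughly the sketch below; load_checkpoint is from mmcv, and the non-strict loading is an assumption:

```python
from mmcv.runner import load_checkpoint


def init_weights(self, pretrained=None):
    """Initialize weights, optionally from a checkpoint (sketch)."""
    if isinstance(pretrained, str):
        # Non-strict so checkpoints with a slightly different head still load.
        load_checkpoint(self, pretrained, strict=False)
    elif pretrained is not None:
        raise TypeError('pretrained must be a str or None')
    # pretrained is None: keep the default module initialization.
```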
    def forward(self, inputs):
        """Forward function.

        Args:
            inputs (dict): Output dict of FbaEncoder.
encoder or decoder?
        conv_out = inputs['conv_out']
        img = inputs['merged']
        two_chan_trimap = inputs['two_chan_trimap']
two_chan_trimap -> two_channel_trimap
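For context on the renamed key: FBA consumes the trimap as two binary planes (definite background / definite foreground) rather than a single-channel map. A hedged sketch of building such a tensor; the 0 / 0.5 / 1 encoding is an assumption, and the PR's actual construction may differ:

```python
import torch


def make_two_channel_trimap(trimap):
    """trimap: (N, 1, H, W) with 0=bg, 0.5=unknown, 1=fg (assumed encoding)."""
    bg = (trimap == 0).float()  # channel 0: definite background
    fg = (trimap == 1).float()  # channel 1: definite foreground
    return torch.cat([bg, fg], dim=1)  # (N, 2, H, W)
```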
                nn.Sequential(
                    nn.AdaptiveAvgPool2d(scale),
                    ConvWS2d(2048, 256, kernel_size=1, bias=True),
                    norm(256, self.batch_norm), nn.LeakyReLU()))
Why are norm_cfg and act_cfg needed if everything is hard-coded to batch_norm and LeakyReLU?
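If the configs are meant to be honored, mmcv already ships builders for both; a sketch of wiring them in instead of the hard-coded pair (assuming the repo depends on mmcv.cnn):

```python
import torch.nn as nn
from mmcv.cnn import build_activation_layer, build_norm_layer

norm_cfg = dict(type='BN')
act_cfg = dict(type='LeakyReLU')

# build_norm_layer returns (name, layer); build_activation_layer returns layer.
block = nn.Sequential(
    nn.Conv2d(2048, 256, kernel_size=1, bias=True),
    build_norm_layer(norm_cfg, 256)[1],
    build_activation_layer(act_cfg))
```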

def norm(dim, bn=False):
    if (bn):
Remove the parentheses.
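Dropping the parentheses gives the following; the GroupNorm fallback branch is an assumption based on the upstream FBA implementation (the else branch is not visible in this diff):

```python
import torch.nn as nn


def norm(dim, bn=False):
    if bn:  # no parentheses needed around the condition
        return nn.BatchNorm2d(dim)
    # Upstream FBA pairs weight-standardized convs with GroupNorm;
    # the group count here is an assumption.
    return nn.GroupNorm(32, dim)
```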
        return out


class ResNet(nn.Module):
ResNet -> ResNetFBA
}


def l_resnet50(pretrained=None, **kwargs):
The function name is cryptic
* Backbone of FBA.
* Init.
* Doc string for forward.
* Doc string of Init.
* Modified API.
* FBAencoder.
* Tiny.
* Doc string.
* Decoder.
* Init.
* General Res.
* Added test.
* Added tests.
* Added two_channel_trimap key.
* Tiny.
* Restore shape.
* Tiny.
* Added utis. May change.
* Added FBA mattor.
* Improved.
* Tiny.
* Modified.
* Tiny.
* Tiny.
* Tiny.
* Delete plugins.
* Modified.
* Tiny.
* Mattor.
* Tiny.
* Tiny.
* Postponed to next branch.
* Tiny.
* Update base_mattor.py

Co-authored-by: lizz <[email protected]>
No description provided.