
Possible bug in number of tensors for layer fusion #60

Open
TheOnlyError opened this issue Jul 31, 2022 · 1 comment

Comments

@TheOnlyError

The figure below shows that the value of a decoder neuron is based on 5 other values:

[figure: decoder neuron whose value is fused from 5 tensors]

When trying to reproduce this in the code, I get the following:

[screenshot: reproduction in code, showing 4 tensors collected]

The number of tensors is 4 instead of 5, which suggests there is a bug in the code around the following lines:

for i in range(depth_decode):
    f = filter_num_skip[i]
    # collecting tensors for layer fusion
    X_fscale = []
    # for each upsampling level, loop over all available downsampling levels (similar to the unet++)
    for lev in range(depth_decode):

I think the second for loop should use depth_decode + 1 as the upper bound of the range. Is this indeed a bug?
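The miscount can be illustrated with a minimal standalone sketch. This is not the library's actual code; it only assumes (as suggested by the figure) a total depth of 5 levels with depth_decode = depth - 1 = 4 decoder levels, and counts how many tensors the inner loop would collect under each bound:

```python
# Hypothetical sketch of the suspected off-by-one (not the library's code).
depth = 5                 # assumed: total number of levels, as in the figure
depth_decode = depth - 1  # assumed: decoder has one level fewer than the encoder

# Current inner loop: range(depth_decode) collects one tensor per iteration.
num_tensors_current = len([lev for lev in range(depth_decode)])

# Proposed fix: range(depth_decode + 1) also includes the deepest level.
num_tensors_proposed = len([lev for lev in range(depth_decode + 1)])

print(num_tensors_current, num_tensors_proposed)  # 4 5
```

With the proposed bound, the count matches the 5 fusion inputs shown in the figure.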

@ziyangwang007

I also noticed that X1EN cannot be copied and cropped to the decoder part through the multi-scale skip connections each time. I agree with you.
