loss.backwards() error #35

Open

anqin1117 opened this issue May 27, 2022 · 0 comments

anqin1117 commented May 27, 2022

File "/homes/qinan/miniconda3/envs/3DUNet/lib/python3.6/site-packages/topologylayer/functional/sublevel.py", line 45, in backward
return None, grad_f.view(retshape).to(device), None, None
RuntimeError: shape '[256, 256, 96]' is invalid for input of size 58

Sorry, when I use the loss function I get the error above. Could you please give me some advice?
I changed a few things to fit my model; my code is below. But when I use this loss function in my train.py, loss.backward() raises the error above.

from topologylayer.nn import (LevelSetLayer, SumBarcodeLengths,
                              PartialSumBarcodeLengths, TopKBarcodeLengths)
# init_tri_complex_3d is my own helper that triangulates a 3D grid

def topological_loss(recon_x, patch_side):
    batch_size = recon_x.size(0)
    b01, b0, b1, b2 = 0., 0., 0., 0.
    # the complex must cover the same grid the patches are reshaped to below
    cpx = init_tri_complex_3d(patch_side, patch_side, patch_side)
    layer = LevelSetLayer(cpx, maxdim=2, sublevel=False)
    f01 = TopKBarcodeLengths(dim=0, k=1)           # longest dim-0 bar
    f0 = PartialSumBarcodeLengths(dim=0, skip=1)   # all dim-0 bars except the longest
    f1 = SumBarcodeLengths(dim=1)
    f2 = SumBarcodeLengths(dim=2)
    for i in range(batch_size):
        dgminfo = layer(recon_x.view(batch_size, patch_side, patch_side, patch_side)[i])
        b01 += (1. - f01(dgminfo)).sum()
        b0 += f0(dgminfo).sum()
        b1 += f1(dgminfo).sum()
        b2 += f2(dgminfo).sum()
    b01 = b01.div(batch_size)
    b0 = b0.div(batch_size)
    b1 = b1.div(batch_size)
    b2 = b2.div(batch_size)
    topo = b01 + b0 + b1 + b2
    return topo
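
For reference, this is roughly how I call it from my train.py (a minimal sketch with placeholder shapes, not my exact pipeline):

import torch

patch_side = 10
recon_x = torch.rand(4, patch_side ** 3, requires_grad=True)  # stand-in for the network output
loss = topological_loss(recon_x, patch_side=patch_side)
loss.backward()  # this is the call that raises the RuntimeError above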
