
Order of the points matters #6

Closed
xinshuoweng opened this issue Sep 8, 2020 · 8 comments

Comments


Hi, thanks a lot for sharing your code here.

I have one question regarding the forward pass of the Chamfer distance. In my experiments, I found that reordering the input point clouds produces different Chamfer distance values. Is this expected? Is there some approximation happening in the code?

From what I understand of the paper "A Point Set Generation Network for 3D Object Reconstruction from a Single Image," the Chamfer distance is defined to be agnostic to the ordering of the input points. Would you mind clarifying this point?
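For reference, that paper defines the Chamfer distance between point sets $S_1$ and $S_2$ as

$$ d_{CD}(S_1, S_2) = \sum_{x \in S_1} \min_{y \in S_2} \lVert x - y \rVert_2^2 + \sum_{y \in S_2} \min_{x \in S_1} \lVert x - y \rVert_2^2 $$

which involves only sums and minima over sets, so it should be invariant to any reordering of the points.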

Thanks!

krrish94 (Owner) commented Sep 8, 2020

Hi @xinshuoweng, thanks for trying this out!

The order of the points shouldn't matter. Perhaps the tensors aren't contiguous in memory? Try calling .contiguous() on the input tensors before passing them to the Chamfer distance module.
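For example (variable names here are just placeholders):

# Sketch: force a contiguous memory layout on both inputs before the call
dist1, dist2, idx1, idx2 = chamferDist(pts1.contiguous(), pts2.contiguous())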

xinshuoweng (Author) commented Sep 8, 2020

Hi @krrish94, thanks for your quick reply!!

As you suggested, I tried .contiguous(), but it did not help.

Here is how I re-order the points:
np.random.shuffle(choice)
pts_return = pts[:, choice]

where 'choice' is just an array of indexes from 0 to the total number of points minus 1. When I feed 'pts' (1 x 3 x M) together with a GT point cloud (1 x 3 x N) to the Chamfer distance function, the result is significantly different from feeding 'pts_return' and the same GT. Any other thoughts on this would be greatly appreciated!
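For completeness, a self-contained sketch of the reordering (torch.randperm stands in for my np.random.shuffle, and the sizes are placeholders):

import torch

pts = torch.randn(1, 3, 10000)           # stand-in for my point cloud, 1 x 3 x M
choice = torch.randperm(pts.shape[2])    # random permutation of 0 .. M-1
pts_return = pts[:, :, choice]           # reorder along the point axis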

krrish94 (Owner) commented Sep 8, 2020

Ah, I see your dimensions are (3, M) and (3, N). Try transposing them? (It's been a while since I looked at this repo, but the README mentions B, N, D ordering: batch size, number of points, dimensionality of each point.)

xinshuoweng (Author) commented Sep 8, 2020

Sorry, I should have made this clear from the start. I do transpose the points in my Chamfer distance loss function, which is defined as follows:

import torch.nn as nn
from chamferdist import ChamferDistance

class CDLOSS(nn.Module):
    '''
    Calculate prediction loss
    Input:
        points1: batch_size * 3 * N1
        points2: batch_size * 3 * N2
    Output:
        chamfer distance (squared distance per point, already averaged over
        batches) between points1 and points2
    '''
    def __init__(self):
        super(CDLOSS, self).__init__()
        self.chamfer = ChamferDistance()

    def forward(self, points1, points2):
        # Convert from B x 3 x N to the B x N x 3 layout chamferdist expects
        points1 = points1.transpose(2, 1)
        points2 = points2.transpose(2, 1)
        dist1, dist2, _, _ = self.chamfer(points1, points2)
        loss = 0.5 * (dist1.mean() + dist2.mean())
        return loss

It is just a simple wrapper around your function. I am used to the 3 x N format, so I added this wrapper around your Chamfer distance function. I do not think the dimensions are the problem.
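For reference, I call the wrapper roughly like this (tensor names and sizes are just placeholders):

import torch

criterion = CDLOSS()
pred = torch.randn(1, 3, 1024).cuda()   # predicted point cloud, 1 x 3 x N1
gt = torch.randn(1, 3, 2048).cuda()     # GT point cloud, 1 x 3 x N2
loss = criterion(pred.contiguous(), gt.contiguous())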

krrish94 (Owner) commented Sep 8, 2020

Thanks for the snippet; this does look weird. I'm assuming you already tried .contiguous() on points1 and points2 in the call to self.chamfer(). Can you confirm whether you're able to run the example?

If issues persist, I can try taking a look at this sometime tomorrow.

xinshuoweng (Author) commented

Yes, I did add contiguous() to both points1 and points2 when calling the CDLOSS function. Also, your provided example runs without any problem. I have been using your code (this Chamfer distance function) for a year and saw no problem until today, when I added random re-ordering of the input point cloud.

Thanks a lot!! I would appreciate it if you could take a look. To give you more context, I am using large-scale point clouds from the KITTI dataset, where each point cloud has about 120,000 points. In the meantime, I will also run more analysis.

krrish94 (Owner) commented Sep 8, 2020

Hi @xinshuoweng,

I tried a couple of experiments on my end, and in short, everything seems to be working okay. Here's a short snippet to help you reproduce my results.

import torch
import chamferdist

# Init Chamfer distance module
chamferDist = chamferdist.ChamferDistance()

# Init two point clouds
a = torch.randn(1, 100, 3).cuda()
b = torch.randn(1, 50, 3).cuda()

# Compute Chamfer distance
d1, d2, i1, i2 = chamferDist(a.contiguous(), b.contiguous())
print(0.5 * (d1.mean() + d2.mean()))

# Shuffle point cloud 'a'
r = torch.randperm(100)
a = a[:, r, :]

# Compute Chamfer distance of the shuffled point cloud
d1_, d2_, i1_, i2_ = chamferDist(a.contiguous(), b.contiguous())
print(0.5 * (d1_.mean() + d2_.mean()))

Both print statements give me the same value. Since the only point of difference between our code snippets is the way you shuffle points, I think that's where you need to look.

Closing this, as it doesn't seem to be an issue with the package. Feel free to re-open if anything comes up!

krrish94 closed this as completed Sep 8, 2020
xinshuoweng (Author) commented Oct 14, 2020

Hi @krrish94,

Sorry, I was busy with other things recently and just came back to revisit this. I finally found the bug; it was a mistake on my side. There is a transpose right before the call to the chamferDist function, and the transposed point clouds, on which I did not call contiguous(), caused this issue. Once I call contiguous() on the transposed point clouds, everything works well. Thank you very much!
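Concretely, a sketch of the fix:

# Before (buggy): transpose returns a non-contiguous view
points1 = points1.transpose(2, 1)

# After (fixed): force a contiguous copy after the transpose
points1 = points1.transpose(2, 1).contiguous()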

One minor suggestion would be to call contiguous() inside the chamferdist function, so that things are still safe if people forget to call it outside.
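Something along these lines at the top of forward() would do it (just a sketch, not the package's actual code):

# Hypothetical safeguard inside ChamferDistance.forward:
points1 = points1.contiguous()
points2 = points2.contiguous()

Since contiguous() returns the tensor unchanged when it is already contiguous, this safeguard should be essentially free in the common case.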

I appreciate your help throughout!!
