Code for getting GT 3D mesh with NeuS #9
Hi, thanks for your interest in our work :) Please refer to this comment.
Thanks!
Hi, it seems the optimization time for each frame is quite long, and the mesh quality is not consistent with my hyperparameters. Would it be possible to share your pseudo-GT meshes for all frames and all characters, as well as the evaluation code for mesh L2 Chamfer Distance (CD) and Normal Consistency (NC)? Thanks!
Sure, the reconstructions along with ARAH results (renderings and meshes) are stored at this link. For evaluation, the script is very messy, so I just share it here; I will need to find some time to integrate it into the code base.
Also, on ZJU-MoCap there were some non-negligible calibration errors for sequences beyond 315. This is also observed by the TAVA paper (Appendix D). Please be aware of this when evaluating 3D reconstructions.
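For reference, L2 Chamfer Distance and Normal Consistency between a predicted mesh and a pseudo-GT NeuS mesh are typically computed by sampling points on both surfaces and matching nearest neighbours. The sketch below is illustrative only and is not the author's shared script; the file names, sample count, and exact averaging convention are assumptions and may differ from the numbers reported in the paper.

```python
# Minimal sketch of symmetric L2 Chamfer Distance and Normal Consistency
# between two meshes. Not the author's evaluation script; conventions
# (squared vs. unsquared distances, mean vs. sum) are assumptions.
import numpy as np
import trimesh
from scipy.spatial import cKDTree

def sample_surface(mesh, n_points=100_000):
    """Sample surface points and their corresponding face normals."""
    points, face_idx = trimesh.sample.sample_surface(mesh, n_points)
    normals = mesh.face_normals[face_idx]
    return points, normals

def chamfer_and_normal_consistency(pred_mesh, gt_mesh, n_points=100_000):
    p_pts, p_nrm = sample_surface(pred_mesh, n_points)
    g_pts, g_nrm = sample_surface(gt_mesh, n_points)

    # Nearest neighbours in both directions.
    d_p2g, idx_p2g = cKDTree(g_pts).query(p_pts)
    d_g2p, idx_g2p = cKDTree(p_pts).query(g_pts)

    # Symmetric L2 Chamfer Distance: mean of squared distances, both ways.
    chamfer = (d_p2g ** 2).mean() + (d_g2p ** 2).mean()

    # Normal Consistency: mean absolute cosine similarity of matched normals.
    nc_p2g = np.abs((p_nrm * g_nrm[idx_p2g]).sum(axis=1)).mean()
    nc_g2p = np.abs((g_nrm * p_nrm[idx_g2p]).sum(axis=1)).mean()
    normal_consistency = 0.5 * (nc_p2g + nc_g2p)

    return chamfer, normal_consistency

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    pred = trimesh.load("pred_frame_000030.ply")
    gt = trimesh.load("neus_gt_frame_000030.ply")
    cd, nc = chamfer_and_normal_consistency(pred, gt)
    print(f"Chamfer: {cd:.6f}  Normal consistency: {nc:.4f}")
```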
Thanks for your detailed reply!
Yeah, for 313 it's because the training set has only 60 frames. For 387, 390, and 392 it is somehow hard to find good frames for reconstruction (by visual inspection); this may be due to the calibration errors and the color of the clothes they wear. Eventually, I did not include 387, 390, and 392 in the evaluation. Besides, I do reconstruction for every 30th frame, similar to the evaluation protocol of novel view synthesis under training poses; otherwise, the workload of NeuS reconstruction would be too large. As for the calibration errors, they happen for certain cameras, maybe even including some (1 or more) training cameras, as I consistently observe misalignment artifacts in novel view synthesis under certain views for 377 and beyond. You may exclude them from the camera set, but that requires redoing the NeuS reconstruction and potentially retraining the models. It's also not clear how to find which cameras are faulty; doing some RANSAC-like triangulation might help to find those outlier cameras, or you could do camera pose optimization as was done in the IDR paper to fix the cameras. But I didn't try these as I became aware of this issue after the camera-ready deadline...
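To illustrate the RANSAC-like triangulation idea mentioned above: one could triangulate the same 3D keypoints from random subsets of cameras and flag cameras whose reprojection error stays consistently large. The sketch below is hypothetical and not part of the ARAH codebase; all function names, the subset size, and the pixel threshold are assumptions.

```python
# Rough sketch of flagging miscalibrated cameras: triangulate joints from
# random camera subsets, then score every camera by its reprojection error.
# Hypothetical helper, not part of any released codebase.
import numpy as np

def triangulate(proj_mats, points_2d):
    """DLT triangulation of one 3D point from several views."""
    A = []
    for P, (x, y) in zip(proj_mats, points_2d):
        A.append(x * P[2] - P[0])
        A.append(y * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(A))
    X = vt[-1]
    return X[:3] / X[3]

def reprojection_error(P, X, xy):
    x_h = P @ np.append(X, 1.0)
    return np.linalg.norm(x_h[:2] / x_h[2] - xy)

def find_outlier_cameras(proj_mats, keypoints_2d, n_trials=200, subset=4, thresh=5.0):
    """proj_mats: list of (3, 4) matrices; keypoints_2d: (n_cams, n_joints, 2)."""
    n_cams, n_joints, _ = keypoints_2d.shape
    errors = np.zeros(n_cams)
    counts = np.zeros(n_cams)
    rng = np.random.default_rng(0)
    for _ in range(n_trials):
        cams = rng.choice(n_cams, size=subset, replace=False)
        for j in range(n_joints):
            X = triangulate([proj_mats[c] for c in cams],
                            [keypoints_2d[c, j] for c in cams])
            # Score every camera against the point triangulated from the subset.
            for c in range(n_cams):
                errors[c] += reprojection_error(proj_mats[c], X, keypoints_2d[c, j])
                counts[c] += 1
    mean_err = errors / np.maximum(counts, 1)
    return np.where(mean_err > thresh)[0], mean_err
```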
Thanks, but the evaluation is conducted on these frames in your code above, right? valid_frames = {'313': [0, 30],
Yes, Table 2 of our paper reports numbers for all sequences except 387, 390, and 392, since the number of good frames is too few for them. Otherwise, it should reproduce the numbers reported in the paper.
Closing the issue due to inactivity.
Hi, thanks for the great work!
Would it be possible to release the code for getting the GT 3D meshes on the ZJU dataset with NeuS? Thanks!
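For context, extracting a mesh from a trained NeuS model is typically done by evaluating the SDF on a dense grid and running marching cubes on the zero level set. The sketch below is a generic illustration under that assumption, not the actual NeuS or ARAH extraction code; `sdf_fn`, the bounding box, and the resolution are hypothetical.

```python
# Generic sketch of mesh extraction from a trained SDF network (as NeuS uses)
# via marching cubes. `sdf_fn` is a hypothetical callable mapping (N, 3) points
# to SDF values; this is not the released NeuS/ARAH code.
import torch
import trimesh
from skimage import measure

@torch.no_grad()
def extract_mesh(sdf_fn, bound=1.0, resolution=512, batch=64**3, device="cuda"):
    # Dense grid of query points inside the bounding box [-bound, bound]^3.
    xs = torch.linspace(-bound, bound, resolution, device=device)
    grid = torch.stack(torch.meshgrid(xs, xs, xs, indexing="ij"), dim=-1).reshape(-1, 3)

    # Query the SDF in chunks to stay within GPU memory.
    sdf = torch.cat([sdf_fn(grid[i:i + batch]) for i in range(0, grid.shape[0], batch)])
    sdf = sdf.reshape(resolution, resolution, resolution).cpu().numpy()

    # Marching cubes at the zero level set, then rescale vertices to world units.
    verts, faces, normals, _ = measure.marching_cubes(sdf, level=0.0)
    verts = verts / (resolution - 1) * 2 * bound - bound
    return trimesh.Trimesh(verts, faces, vertex_normals=normals)
```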