Bad geometry and texture map visualization #2

Open
ghy0324 opened this issue Dec 21, 2021 · 7 comments

ghy0324 commented Dec 21, 2021

Hi, thanks for the amazing work!
I trained the network on the scan 114 data and got accurate rendering results like this:
[image: step-01000000-ray_color]
However, when I ran visualize_nerf_atlas_radiance.py to visualize the geometry and the texture map, I found that they look strange.
Point cloud:
[image]
Mesh:
[image]
Texture maps:
[image: cube_view_63]
[image: sphere_view_63]

@huizhang0110

I have the same question. It seems visualize_nerf_atlas_radiance.py cannot reproduce the original appearance of scan 114. Would you mind providing more details about how to run visualize_nerf_atlas_radiance.py?

fbxiang (Owner) commented Dec 28, 2021

visualize_nerf_atlas_radiance.py will not reproduce the original appearance, since volumetric rendering is still needed for that. However, the texture maps do not look right (they should be similar to the ones shown in the paper). Could you send me the trained model that produces accurate renderings but incorrect textures?

huizhang0110 commented Dec 28, 2021

Hi, @fbxiang, thanks for your kind reply.
Following the instructions in the README, I trained NeuTex from scratch on the DTU/scan114 scene, but the texture map and mesh from visualize_nerf_atlas_radiance.py look very strange. You can download the log files from the following link:
https://drive.google.com/drive/folders/16JgNqxIrb1z7HNN8s0ijC3JC18C6xWzh?usp=sharing

Auggst commented Feb 21, 2022

Has this issue been solved?

fbxiang (Owner) commented Feb 22, 2022

After some investigation, I cannot reproduce @ghy0324's issue, so I suspect the visualization script was not used correctly. It can be run by simply replacing the Python filename in the shell script. I have also pushed a dedicated visualization script. After 80000 steps (which is not fully trained), I got the following mesh and texture, which look reasonable.
[image]
[image: cube_view_14]

@huizhang0110's issue looks quite different. Visualizing the mesh suggests that the geometry training got stuck. I was able to reproduce this by running the script repeatedly. The main reason is that this repository does not contain the code for pretraining the atlas to fit the point cloud as described in the paper (that part of the code contains custom kernels and is hard to run out of the box). I may not be able to integrate it soon, since I am no longer working on this topic. For now, simply re-training may resolve the issue.
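For reference, a minimal sketch of what such a pretraining step could look like: fitting the atlas mapping network to a target point cloud with a symmetric Chamfer loss. This is not the code from the paper; `atlas`, `pretrain_atlas`, and the sampling scheme are hypothetical stand-ins.

```python
import torch

def chamfer_loss(pred, target):
    # pred: (N, 3) points produced by the atlas mapping; target: (M, 3) point cloud.
    d = torch.cdist(pred, target)  # (N, M) pairwise Euclidean distances
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

def pretrain_atlas(atlas, point_cloud, steps=2000, n_samples=4096, lr=1e-4):
    # `atlas` is assumed to be an nn.Module mapping (N, 2) UV samples to (N, 3) points.
    opt = torch.optim.Adam(atlas.parameters(), lr=lr)
    for _ in range(steps):
        uv = torch.rand(n_samples, 2, device=point_cloud.device)  # sample the texture domain
        loss = chamfer_loss(atlas(uv), point_cloud)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return atlas
```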

ybbbbt commented Feb 25, 2022

Hi @fbxiang, thank you for sharing the code.
We ran DTU scan_114 with the default settings and the dataset in this repo, but the rendered image does not seem to converge well, even after training for 500000 iterations. Can you give us some advice?

[image: step-00500000-ray_color]

fbxiang (Owner) commented Feb 25, 2022

The most important parameter to tune is sample_num: change it from 64 to 256 (that is the setting in the paper, but it requires a lot of GPU memory). I also recommend increasing or decreasing random_sample_size to fill your GPU memory for faster training; a rough way to rebalance the two is sketched below.
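As a rough rule of thumb (my assumption, not something stated in the repo): per-step memory scales with the number of samples evaluated per batch, so after raising sample_num you can lower random_sample_size to keep the product roughly constant. A tiny sketch with hypothetical starting values:

```python
# Hypothetical rebalancing: keep random_sample_size * sample_num roughly constant
# so that 4x more samples per ray still fits in the same GPU memory.
# (If random_sample_size is a patch side length, budget by its square instead.)
old_sample_num, old_random_sample_size = 64, 70  # assumed current values
new_sample_num = 256  # paper setting

budget = old_random_sample_size * old_sample_num
new_random_sample_size = budget // new_sample_num
print(new_random_sample_size)  # -> 17
```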
