Usage
Next, a usage example of the available modules is presented. For this we used the Sceaux Castle images and the OpenMVG pipeline to recover the camera positions and the sparse point-cloud. Please note that all output presented here is the original output obtained automatically by the OpenMVS pipeline, with no manual manipulation of the results. The complete example (including Windows x64 binaries for the modules) can be found at OpenMVS_sample.
All OpenMVS binaries accept command line parameters, which are explained in detail when the binary is executed with no parameters or with -h.
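For example, running the densification module with -h prints its full parameter list:
DensifyPointCloud -h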
@FlachyJoe contributed a script which automates the process of running OpenMVG and OpenMVS in a single command line. The same results as below can be obtained by running:
python MvgMvs_Pipeline.py <images_folder> <output_folder>
After reconstructing the scene, OpenMVG will generate by default the sfm_data.bin file containing the camera poses and the sparse point-cloud. Using the exporter tool provided by OpenMVG, we convert it to the OpenMVS project scene.mvs:
openMVG_main_openMVG2openMVS -i sfm_data.bin -o scene.mvs -d scene_undistorted_images
The directory specified with the -d switch will store the undistorted images.
To import a scene reconstructed by OpenMVG using the old .json ASCII format, run the following:
openMVG_main_openMVG2openMVS -i scene.json -o scene.mvs
A typical sparse point-cloud obtained by the previous steps will look like this:
If there are missing scene parts, the dense reconstruction module can recover them by estimating a dense point-cloud:
DensifyPointCloud scene.mvs
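Densification can be tuned if runtime or memory is a concern; depending on your OpenMVS version, parameters such as --resolution-level (how many times the images are scaled down before depth-map estimation) and --number-views (how many neighbor views are used per depth map) are available, as listed by -h, for example:
DensifyPointCloud scene.mvs --resolution-level 2 --number-views 4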
The obtained dense point-cloud (please note the vertex colors are roughly estimated, and only for visualization; they do not contribute further down the pipeline):
The sparse or dense point-cloud obtained in the previous steps is used as the input of the mesh reconstruction module:
ReconstructMesh scene_dense.mvs
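Since the module accepts either point-cloud, the rough mesh used in the refinement example below can also be obtained directly from the sparse reconstruction:
ReconstructMesh scene.mvs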
The obtained mesh:
The mesh obtained either from the sparse or the dense point-cloud can be further refined to recover fine details or even larger missing parts. Next, the rough mesh obtained only from the sparse point-cloud is refined:
RefineMesh scene_mesh.mvs
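If refinement takes too long on large scenes, depending on your OpenMVS version, parameters such as --resolution-level (how many times the images are scaled down) and --scales (the number of coarse-to-fine scales to process), both listed by -h, can be used to trade accuracy for speed, for example:
RefineMesh scene_mesh.mvs --resolution-level 1 --scales 2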
The mesh before and after refinement:
The mesh obtained in the previous steps is used as the input of the mesh texturing module:
TextureMesh scene_dense_mesh.mvs
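Depending on your OpenMVS version, the textured model can also be exported directly to a different format via the --export-type parameter (listed by -h), for example to Wavefront OBJ:
TextureMesh scene_dense_mesh.mvs --export-type obj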
The obtained mesh plus texture:
Each of the above commands also writes a .ply file that can be used with many third-party tools.
OpenMVS comes with a viewer program, named Viewer, which can open any .mvs file created by OpenMVS, as well as .ply and .obj files. The viewer expects the input file either on the command line, or dragged and dropped onto its window after it is started.
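For example, assuming the texturing step wrote its result as scene_dense_mesh_texture.mvs (the default output name appends a suffix to the input file name; check the console output for the exact name), the final textured model can be inspected with:
Viewer scene_dense_mesh_texture.mvs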