Data disappears from point cloud when exporting at low decimation value #1368

Open
torjusti opened this issue Oct 29, 2024 · 8 comments

@torjusti

torjusti commented Oct 29, 2024

Hello!

I've got a pretty huge scan (the DB file is almost 70 GB) from which I'm generating point clouds using rtabmap-databaseViewer.

What I find is that when exporting with decimation level 8 and no normal generation, I get all the data I want. If I export at lower decimation levels (4, 2, 1), I get progressively less and less data. By that I mean that portions of the scan disappear, seemingly at random and not related to when they were scanned.

Also, if I generate a scan with decimation level 8 and normal computation enabled, the entire thing disappears. The PLY is still large, so there should be data, but I can't see anything when opening the file in a point cloud viewer. I also see that the coordinate bounds of the scan reported by CloudCompare go to values orders of magnitude larger than they normally would (10e36 is what I recall, although I don't have a broken file handy right now).

Not sure what could be causing this - is there a known upper limit on the size of point cloud that can be exported? A potential upstream issue? Any help greatly appreciated :)

@matlabbe
Member

How large is the area in meters? Are you also setting a voxel size? The voxel filter has a size limit, though we implemented a workaround that splits the space into chunks before applying the voxel filter (it should still work, just with some warnings).

With a lot of points, it is possible that you are running out of RAM. You may keep an eye on RAM usage while it is exporting. A trick with large maps is to split the output into multiple exports, using this option:
[screenshot of the region filtering option in the export dialog]

With rtabmap-export CLI tool, the corresponding options are:

    --xmin #              Minimum range on X axis to keep nodes to export.
    --xmax #              Maximum range on X axis to keep nodes to export.
    --ymin #              Minimum range on Y axis to keep nodes to export.
    --ymax #              Maximum range on Y axis to keep nodes to export.
    --zmin #              Minimum range on Z axis to keep nodes to export.
    --zmax #              Maximum range on Z axis to keep nodes to export.
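
For example, something like this should export one 100 m x 100 m tile at a time (the ranges and database name are just illustrative; repeat with shifted ranges for the other tiles):

    # Export only the nodes inside a 100 m x 100 m tile of the map.
    rtabmap-export --xmin 0 --xmax 100 --ymin 0 --ymax 100 rtabmap.db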

cheers,
Mathieu

@torjusti
Author

torjusti commented Nov 1, 2024

The area is approximately 275 x 200 meters (not sure how to compute the surface area, if that is what you mean). I forgot to mention it, but I don't voxelize, as that adds more time to the export. I also don't think I'm running out of RAM.

Cutting the scan into different areas and exporting them separately is a good idea, thanks; that might be a good workaround.

@matlabbe
Member

matlabbe commented Nov 1, 2024

I also don't think I'm running out of RAM

Then I am not sure what is happening. Out of curiosity, can you copy/paste the output of:

rtabmap-info --diff rtabmap.db

For that area size, 70 GB is a lot; I am curious how much space the raw data takes up in the database versus the number of nodes in the map.

@torjusti
Author

torjusti commented Nov 1, 2024

➜  VDG rtabmap-info --diff merged-2.0.db 
Parameters (Yellow=modified, Red=old parameter not used anymore, NA=not in database):
BRIEF/Bytes=                       64  (default=32)
Db/TargetVersion=                  0.21.1  (default=)
GFTT/QualityLevel=                 0.0001  (default=0.001)
GridGlobal/FullUpdate=             true  (default=NA)
Icp/CorrespondenceRatio=           0.49  (default=0.1)
Icp/Epsilon=                       0.001  (default=0)
Icp/Iterations=                    10  (default=30)
Icp/MaxRotation=                   0.17  (default=0.78)
Icp/MaxTranslation=                0.05  (default=0.2)
Icp/PointToPlane=                  true  (default=false)
Kp/MaxFeatures=                    750  (default=500)
Kp/Parallelized=                   false  (default=true)
Marker/CornerRefinementMethod=     3  (default=0)
Marker/Dictionary=                 20  (default=0)
Marker/Length=                     0.100000  (default=0)
Marker/MaxDepthError=              0.04  (default=0.01)
Mem/CompressionParallelized=       false  (default=true)
Mem/ImagePreDecimation=            2  (default=1)
Mem/RehearsalSimilarity=           0.3  (default=0.6)
Mem/STMSize=                       20  (default=10)
Mem/UseOdomFeatures=               false  (default=true)
Mem/UseOdomGravity=                true  (default=false)
Optimizer/GravitySigma=            0.2  (default=0.3)
Optimizer/Iterations=              10  (default=20)
RGBD/AngularUpdate=                0.05  (default=0.1)
RGBD/LinearUpdate=                 0.05  (default=0.1)
RGBD/MarkerDetection=              true  (default=false)
RGBD/MaxLocalRetrieved=            0  (default=2)
RGBD/OptimizeFromGraphEnd=         true  (default=false)
RGBD/OptimizeMaxError=             8  (default=3.0)
Rtabmap/DetectionRate=             2  (default=1)
Rtabmap/MaxRetrieved=              1  (default=2)
Rtabmap/MemoryThr=                 2000  (default=0)
Rtabmap/PublishLikelihood=         false  (default=true)
Rtabmap/PublishPdf=                false  (default=true)
Vis/MaxFeatures=                   400  (default=1000)
Vis/MinInliers=                    25  (default=20)
Icp/DebugExportFormat=             NA  (default="")
Icp/PMConfig=                      NA  (default="")
Kp/DictionaryPath=                 NA  (default="")
Marker/Priors=                     NA  (default="")
OdomOKVIS/ConfigPath=              NA  (default="")
OdomORBSLAM/VocPath=               NA  (default="")
OdomOpenVINS/LeftMaskPath=         NA  (default="")
OdomOpenVINS/RightMaskPath=        NA  (default="")
OdomVINS/ConfigPath=               NA  (default="")
PyDescriptor/Path=                 NA  (default="")
PyDetector/Path=                   NA  (default="")
PyMatcher/Path=                    NA  (default="")
Rtabmap/WorkingDirectory=          NA  (default="")
SuperPoint/ModelPath=              NA  (default="")

Info:

Path:               merged-2.0.db
Version:            0.21.6
Sessions:           63
Total odometry length:22863.091797 m
Total time:         44156.694827s
LTM:                91718 nodes and 52352894 words (dim=64 type=8U)
WM:                 2018 nodes and 1190259 words
Global graph:       91718 poses and 396430 links
Optimized graph:    0 poses (x=396430->0, y=0->0, z=0->0)
Maps in graph:      0/63 []
Ground truth:       0 poses
GPS:                0 poses
Links:
  Neighbor:         85859	(length avg: 0.27m, std: 0.20m, max: 10.50m)
  GlobalClosure:    3219	(length avg: 0.36m, std: 0.37m, max: 4.34m)
  LocalSpaceClosure:3354	(length avg: 0.32m, std: 0.22m, max: 1.00m)
  LocalTimeClosure: 0	(length avg: 0.00m, std: 0.00m, max: 0.00m)
  UserClosure:      59703	(length avg: 0.34m, std: 0.27m, max: 3.82m)
  VirtualClosure:   0	(length avg: 0.00m, std: 0.00m, max: 0.00m)
  NeighborMerged:   0	(length avg: 0.00m, std: 0.00m, max: 0.00m)
  PosePrior:        0	(length avg: 0.00m, std: 0.00m, max: 0.00m)
  Landmark:         442	(length avg: 1.12m, std: 0.35m, max: 3.24m)
  Gravity:          91718	(length avg: 0.00m, std: 0.00m, max: 0.00m)

Database size:      69503 MB
Nodes size:         17 MB	(0.02%)
Links size:         138 MB	(0.20%)
RGB Images size:    48745 MB	(70.13%)
Depth Images size:  5772 MB	(8.30%)
Calibrations size:  15 MB	(0.02%)
Grids size:         1 MB	(0.00%)
Scans size:         165 MB	(0.24%)
User data size:     0 Bytes	(0.00%)
Dictionary size:    4857 MB	(6.99%)
Features size:      9035 MB	(13.00%)
Statistics size:    115 MB	(0.17%)
Other (indexing, unused):640 MB	(0.92%)

The DB consists of several scans merged together with significant overlap, reprocessed with a higher-than-default number of features extracted per image. I guess that all contributes to the increased size.

@matlabbe
Member

matlabbe commented Nov 2, 2024

Thanks for the info, I see: 63 mapping sessions merged together. In case RGB resolution doesn't matter too much in your case, you could also set Mem/ImagePostDecimation=2 when reprocessing; that would save some RGB space (of the 48745 MB).
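
A sketch of how that could look with the rtabmap-reprocess tool (assuming the usual --Parameter value syntax of the RTAB-Map tools; file names are illustrative):

    # Rebuild the database with RGB images decimated by 2 after feature
    # extraction, reducing the resolution of the stored RGB images.
    rtabmap-reprocess --Mem/ImagePostDecimation 2 merged-2.0.db merged-reprocessed.db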

For the original issue, did you try enabling Nodes filtering? Maybe for some reason there are points in your data that are very far away (because of poor depth estimation); you could try setting a maximum depth distance when exporting. That would filter out these points.
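
With the CLI tool, that would look something like this (assuming rtabmap-export's --max_depth option behaves like the dialog's maximum depth setting):

    # Ignore depth values beyond 4 m when generating the point cloud.
    rtabmap-export --max_depth 4 merged-2.0.db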

@torjusti
Author

torjusti commented Nov 2, 2024

63 mapping sessions merged together

Yes hehe, quite a lot of work :)

I use 4 meters as the maximum accepted depth when exporting; the problem persists.

@matlabbe
Member

matlabbe commented Nov 2, 2024

Well, in general I would say that if you can share the database, I would take a look to reproduce the issue here; however, 70 GB may be too large for you to upload. If you could reproduce the issue on one of the sessions (or by merging some of them) and share only that database, that would be useful (you can send me a private Google Drive link if the data cannot be shared publicly; my gmail account is the same as my username here).

@matlabbe
Member

matlabbe commented Nov 5, 2024

I created some issues above to test your data further.

In the meantime, I saw in the dataset that there are sometimes traversals of the same path in different sessions that are not connected together, so there are many double walls in the map. That can be fixed by manually adding new constraints with the "Constraints View" of rtabmap-databaseViewer. You may also adjust RGBD/OptimizeMaxError to avoid some wrong rejections.
[screenshot of the double walls in the map]

Here is an example where I fixed two paths by adding a loop closure on some symbols on a wall, seen from two points of view in two different sessions:
[screenshots of the added loop closure in rtabmap-databaseViewer]
