
M3C2 with large datasets

Posted: Thu Nov 24, 2016 5:33 pm
by ebash
I am working with large point clouds from UAV imagery (approximately 230M points). I want to use the M3C2 plugin to look at small changes between point clouds when different ground control points are used for orienting the imagery. When I try to run the tool, however, it gets stuck on the calculation after a day and I have to force quit. Am I working with point clouds which are too large? Or is there a way to change the settings to make it run more efficiently? Or to split it into chunks and then merge them?

Re: M3C2 with large datasets

Posted: Thu Nov 24, 2016 7:37 pm
by daniel
The issue may lie more with the parameters: did you use the 'guess' button to get approximate values? And does your cloud already have normals, or are you letting the plugin compute them? Last but not least, it's generally a good idea to test the results first on a sub-sampled version of the cloud (see the 'core points' option).
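To illustrate the 'core points' idea: the distances are computed only at a decimated set of points, while the full clouds still feed the neighbourhood statistics. This is a rough voxel-style subsampling sketch in NumPy (not CloudCompare's actual code; the function name is made up):

```python
import numpy as np

def spatial_subsample(points, min_dist):
    """Keep roughly one point per cubic cell of side `min_dist`.

    A crude stand-in for CloudCompare's spatial subsampling: all points
    falling in the same voxel are collapsed to the first one seen.
    """
    keys = np.floor(points / min_dist).astype(np.int64)
    # np.unique on the voxel keys yields one representative index per cell
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

# 10,000 random points in a 10 m x 10 m x 1 m slab
rng = np.random.default_rng(0)
cloud = rng.uniform([0, 0, 0], [10, 10, 1], size=(10_000, 3))
core = spatial_subsample(cloud, min_dist=0.5)
print(len(cloud), "->", len(core))
```

Running M3C2 on `core` instead of `cloud` cuts the number of cylinders to evaluate by roughly the subsampling factor.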

Don't hesitate to post snapshots of your cloud and of the parameters.

Re: M3C2 with large datasets

Posted: Fri Dec 02, 2016 2:55 pm
by ebash
Thanks, somehow I missed your reply... I did use the 'guess' button for the parameters, and the plugin is computing the normals. The parameter guessing already uses a subsampled cloud. I will try to post some screenshots tomorrow; right now my computer is tied up with other processing and won't have enough memory to load the clouds.

Re: M3C2 with large datasets

Posted: Mon Dec 05, 2016 6:52 pm
by ebash
Here is a screenshot after guessing the parameters. Is the calculation slowed down more by the number of points or by their density? The point cloud is very dense: 350 pts/m².

https://drive.google.com/file/d/0B3Ul_G ... sp=sharing

Re: M3C2 with large datasets

Posted: Mon Dec 05, 2016 7:34 pm
by daniel
Wow, 400,000 points per cell at level 7?! I wonder why the plugin chose this configuration... it's obviously a glitch. You should greatly reduce the diameter (0.5 seems more than enough, and maybe already too much).
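As a back-of-envelope check on why the diameter matters so much: for a roughly uniform planar density of ρ pts/m², a search cylinder of radius r gathers about ρ·π·r² points, so the per-point cost grows quadratically with the diameter. With the 350 pts/m² quoted above (a sketch, assuming the density is uniform):

```python
import math

density = 350.0          # points per square metre (from the thread)
for diameter in (10.0, 2.0, 0.5):
    r = diameter / 2.0
    # expected neighbour count within a cylinder of radius r
    expected = density * math.pi * r * r
    print(f"diameter {diameter:>4} m -> ~{expected:,.0f} points per neighbourhood")
```

So a 0.5 diameter already captures on the order of 70 points per neighbourhood at that density, which is plenty for a plane fit, while a diameter tens of times larger pushes the count into the hundreds of thousands.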

On my side I'll have to investigate this issue. Is it possible for you to share the cloud with me? (if yes you can send me a link to cloudcompare [at] danielgm.net).

Re: M3C2 with large datasets

Posted: Thu Jan 05, 2017 9:54 pm
by ebash
Daniel,
I think the problem with running the plugin lies in my data; I was able to run it successfully on a different dataset of similar size. I am going to keep looking into it after working with the successful result from the second dataset for a while.

I wonder if there is a way, within CloudCompare, to compare the M3C2 distance to actual measured change stored in a shapefile or text file. Or to export the M3C2 distance in a format readable in ArcMap? When I try to export a .las file, the scalar fields get ignored.

Re: M3C2 with large datasets

Posted: Fri Jan 06, 2017 7:49 am
by daniel
Indeed, on my side I found that the plugin was overestimating the normal radius in some cases (when you click on the 'Guess parameters' button). I fixed it.

And there's no simple way to compare the M3C2 distances with other distances inside CloudCompare. But you could export them as a raster (with the Rasterize tool). With the 2.8 version you should be able to generate the raster with the M3C2 distances as active 'layer' and then export it to a geotiff file.
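The binning step behind that rasterization is simple to picture: average the active scalar field over a regular XY grid. A minimal NumPy sketch (the helper name is made up; no georeferencing or GeoTIFF writing, just the binning):

```python
import numpy as np

def rasterize_mean(xy, values, cell):
    """Average a per-point scalar onto a regular grid (NaN = empty cell).

    A minimal stand-in for the binning step of the Rasterize tool.
    """
    xmin, ymin = xy.min(axis=0)
    xmax, ymax = xy.max(axis=0)
    nx = int(np.ceil((xmax - xmin) / cell)) + 1
    ny = int(np.ceil((ymax - ymin) / cell)) + 1
    ix = ((xy[:, 0] - xmin) / cell).astype(int)
    iy = ((xy[:, 1] - ymin) / cell).astype(int)
    flat = iy * nx + ix
    sums = np.bincount(flat, weights=values, minlength=nx * ny)
    counts = np.bincount(flat, minlength=nx * ny)
    with np.errstate(invalid="ignore"):   # 0/0 -> NaN for empty cells
        grid = sums / counts
    return grid.reshape(ny, nx)

rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(5000, 2))
dist = 0.01 * pts[:, 0]                  # fake M3C2 distances, growing eastwards
grid = rasterize_mean(pts, dist, cell=1.0)
print(grid.shape)
```

Once such a grid is written out as a GeoTIFF (as CloudCompare 2.8 does), ArcMap can difference it against the measured-change layer directly.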

P.S.: technically, it's also possible to export any scalar field as the 'Intensity' field of a LAS file. To do this, make sure no other scalar field in your cloud is named 'Intensity', and rename your own scalar field to 'Intensity' (mind the capital 'I'). Sadly, though, the 'Intensity' field of a LAS file is limited to integer values between 0 and 65535, so it's not very practical.
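Since M3C2 distances are signed floats, squeezing them into that 0–65535 integer range means offsetting and scaling them, and keeping the mapping so the values can be recovered on the other side (e.g. with ArcMap's raster calculator). A small sketch (the helper name is made up):

```python
import numpy as np

def to_intensity(values):
    """Map a signed scalar field onto the 0-65535 LAS 'Intensity' range.

    Returns the uint16 codes plus the (offset, scale) needed to recover
    approximate values after import.
    """
    vmin, vmax = float(values.min()), float(values.max())
    scale = (vmax - vmin) / 65535.0 if vmax > vmin else 1.0
    codes = np.round((values - vmin) / scale).astype(np.uint16)
    return codes, vmin, scale

distances = np.array([-0.12, -0.03, 0.0, 0.05, 0.21])  # fake M3C2 distances (m)
codes, offset, scale = to_intensity(distances)
restored = codes * scale + offset                       # undo the mapping
print(codes, np.max(np.abs(restored - distances)))
```

The quantization error is at most half the scale step, which for centimetre-level change detection is usually negligible; the real inconvenience is having to carry the offset/scale pair alongside the file.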