[Done] Data Cleanup: Duplicate Point Detection

andrewmcunliffe
Posts: 7
Joined: Thu Jan 30, 2014 2:23 pm

[Done] Data Cleanup: Duplicate Point Detection

Post by andrewmcunliffe »

Hi Daniel

Thanks for making your program available; it really is great. I'm working with TLS-derived clouds of vegetation, and it's refreshing to be able to view the data so easily.

Would it be possible to implement a function to detect and remove duplicate points (perhaps within a user-specified tolerance) to support data clean-up?

Many Thanks,
Andy
daniel
Site Admin
Posts: 7366
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: Data Cleanup: Duplicate Point Detection

Post by daniel »

Indeed there should be a dedicated method to do this.

Meanwhile, you can do it indirectly by computing the cloud 'density' (Tools > Other > Density). For duplicated points, the density is theoretically infinite (in practice, CC caps it at a maximum value). So you just have to remove the points with the highest density values (with "Edit > Scalar Fields > Filter by value").
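
For illustration only, here is a minimal sketch of that workaround outside CloudCompare, assuming Python with numpy/scipy (the function name and radius are placeholders; this is not CC's internal density code):

    import numpy as np
    from scipy.spatial import cKDTree

    def filter_by_density(points, radius=1e-9):
        """Mimic 'Density' + 'Filter by value': drop every point whose
        neighbourhood (within 'radius') contains another point."""
        tree = cKDTree(points)
        # Each point counts itself, so a count > 1 flags coincident points.
        counts = tree.query_ball_point(points, r=radius, return_length=True)
        return points[np.asarray(counts) == 1]

Note that this drops every coincident point rather than keeping one copy of each, which is exactly the caveat raised in the next post.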
Daniel, CloudCompare admin
daniel
Site Admin
Posts: 7366
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: Data Cleanup: Duplicate Point Detection

Post by daniel »

Hmm, I realize that I answered a bit too quickly: if you apply the alternative method I've just proposed, you'll remove all of the points lying at the same place (instead of keeping one of them).

I've just added a proper method to do this. You can test it with the latest online 'beta' release (http://www.cloudcompare.org/release). You'll find it under "Tools > Other > Remove duplicate points".
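
For readers scripting outside CloudCompare, a rough sketch of what such a pass does conceptually, again assuming Python with numpy/scipy (this is not CC's actual implementation; the function name and tolerance are hypothetical):

    import numpy as np
    from scipy.spatial import cKDTree

    def remove_duplicate_points(points, min_dist=1e-12):
        """Keep one representative from each group of points lying
        closer than 'min_dist' to each other."""
        tree = cKDTree(points)
        keep = np.ones(len(points), dtype=bool)
        # Sorting makes the result deterministic: for each close pair
        # (i, j) with i < j, the lower-index point wins.
        for i, j in sorted(tree.query_pairs(r=min_dist)):
            if keep[i]:
                keep[j] = False
        return points[keep]

Unlike the density workaround above, this keeps one copy of each duplicate instead of discarding them all.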
Daniel, CloudCompare admin
andrewmcunliffe
Posts: 7
Joined: Thu Jan 30, 2014 2:23 pm

Re: Data Cleanup: Duplicate Point Detection

Post by andrewmcunliffe »

Thanks for adding this functionality.

I've got a query about the default 'Min distance' setting, though. I appreciate that CC is unitless, but in my experience most users work with clouds in meters; consequently, a default value of 0.000000000001 m seems excessively small. Unless you're sure you'd rather the tool default to removing no points in most applications, perhaps a default of 0.001 (i.e., often 1 mm) would be more relevant, given the limitations of the tools used to acquire many of these datasets (beam divergence, etc.).

I'd be interested to know your thoughts on this.
Cheers,
Andy
daniel
Site Admin
Posts: 7366
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: Data Cleanup: Duplicate Point Detection

Post by daniel »

On my side, we work mainly in millimeters (which makes the issue you mention here even worse ;).

In fact, this tool was intended to remove "real" duplicate points (whose coordinates differ only slightly due to numerical errors). Such points are typically encountered in STL files, for instance. To remove points that are merely too close to each other, the "Subsample" method is better suited (it's almost the same thing, in fact).
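
To illustrate the difference, here is a rough sketch of space-based subsampling, in the same Python/numpy/scipy setting (a greedy stand-in for CC's "Subsample" tool, not its actual code):

    import numpy as np
    from scipy.spatial import cKDTree

    def space_subsample(points, min_dist):
        """Greedy spatial subsampling: keep a point only if no
        already-kept point lies within 'min_dist' of it."""
        tree = cKDTree(points)
        keep = np.zeros(len(points), dtype=bool)
        covered = np.zeros(len(points), dtype=bool)
        for i in range(len(points)):
            if covered[i]:
                continue
            keep[i] = True
            # Everything within 'min_dist' of a kept point is covered.
            for j in tree.query_ball_point(points[i], r=min_dist):
                covered[j] = True
        return points[keep]

With a tiny min_dist this behaves like duplicate removal; with a larger one it thins the whole cloud, which is why the two tools are "almost the same".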

But the simplest fix would be to make CC "remember" the last input value, so that everyone can set it once and for all.
Daniel, CloudCompare admin
daniel
Site Admin
Posts: 7366
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: Data Cleanup: Duplicate Point Detection

Post by daniel »

OK, that's done (it will be effective in the next release).
Daniel, CloudCompare admin