
How to adjust the dense point cloud filtering #911

Closed
julianrendell opened this issue May 23, 2020 · 16 comments
Labels: stale, type:question

Comments

@julianrendell commented May 23, 2020

I ran both Meshroom (latest source) and Agisoft Metashape (trial) against the monstree full dataset.

In the viewer, they both have amazing results.

Meshroom picked up the holes at the top-left of the knot hole! Metashape has made it black textured geometry.

But Metashape has done a better job of excluding outliers and focusing on the tree trunk in the mesh reconstruction. In Meshroom, both sides of the trunk have "little wings" which look to be heading towards two sparse clusters of points at the edges of the dense point cloud. (Note: Metashape also appears to have these two sparse clusters in its point cloud, but they appear to be filtered out of the dense point cloud.)

Would these "wings" be gone if the "not-related" points were filtered out?

What options should I tweak to try and filter these?

Is it possible to load the dense point cloud in, say, MeshLab and manually delete this noise? (And is it as simple as saving the edit and continuing with the processing in Meshroom?)

Or am I focusing on the wrong stage? Is there somewhere else in the pipeline I should be making adjustments?

Thanks in advance!

@NexTechAR-Scott

Meshlab won't import Alembic, the format for the point cloud.

Blender will, but since the Alembic file is a full scene, you'll get all the cameras as well.

Bit messy to clean up.

But none of that matters much as I've never been able to generate a new Alembic outside of Meshroom that works when brought back in.

Pics of your issue would help with suggestions on what to tweak.

@natowi (Member) commented May 23, 2020

You could use DepthMapFilter to mask out unwanted areas.

Experimental:
You can also convert the .abc file to .ply using ConvertSfMFormat. You need to copy the latest ConvertSfMFormat.py file from GitHub into Meshroom-2019.2.0\lib\meshroom\nodes\aliceVision and remove ConvertSfMFormat.pyc. Then you can select "unknown" as the describer type and copy the path of the .abc file into the ConvertSfMFormat input. When you are done editing, you should be able to convert the file back to .abc.
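For reference, roughly the same conversion can be driven from Python via the AliceVision command-line tool behind that node; this is only a sketch, assuming aliceVision_convertSfMFormat is on your PATH, and the cache path is a placeholder:

```python
import subprocess

# Sketch: convert the SfM Alembic to PLY with the CLI behind ConvertSfMFormat.
# The cache path below is a placeholder; point it at your StructureFromMotion output.
abc_in = "MeshroomCache/StructureFromMotion/<uid>/sfm.abc"
ply_out = "sfm_points.ply"

subprocess.run(
    ["aliceVision_convertSfMFormat",
     "--input", abc_in,
     "--output", ply_out,
     "--describerTypes", "unknown"],   # matches the "unknown" describer type mentioned above
    check=True,
)
```

After editing the .ply (e.g. in MeshLab), converting back should just be a matter of swapping the input and output paths, as noted above.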

@julianrendell (Author)

Awesome - thanks!

I’m running trunk, so can give that a try.

I've been reading all the docs again and some of it is making more sense. There was a setting on one of the main nodes about using the angle (larger == closer) as a discriminator. I'll try that first.

@natowi I'd like to try to help with the docs, but the subject is intimidating; what's the best way to discuss some starting points? Should I open an issue on the docs repo, or is there a Google group? And as a slight derail: is it better to ask general questions here, or is the group preferred? I'm unclear which is the appropriate place (the Google group seems noisier). I have some questions about general approaches for a couple of scenarios that may be common starting points, and I'd be willing to write up the final info as a guide for the docs.

@fabiencastan (Member)

Hi @julianrendell,
That would be great. If you are interested, we could set up a conference call this week with @natowi to discuss how to organize things.

@natowi (Member) commented May 23, 2020

That is a good idea. It would also be a good place to discuss your other ideas, @julianrendell.

@julianrendell (Author) commented May 23, 2020

@NexTechAR-Scott here's the Metashape result:
[screenshot: Metashape result]

and here's the Meshroom result:

[screenshot: Meshroom result]

Meshroom has picked up more of the ground, but that's a trivial fix. What I'm not sure about is the "wings"/"ears": the extra mesh to the sides at the top.

@julianrendell (Author)

@fabiencastan @natowi I'd be happy to chat about where you need help and whether I can be of assistance. I'm in the PST timezone.

@julianrendell (Author)

Update: improvements and losses...

Here are the results of the default pipeline. The second branch is for the "experiments" below. (BTW, now that I'm starting to understand the pipeline, I'm really appreciating it. Great design choice!)

I've oriented it to show the object of interest as well as the SFM point cloud. You can see the "ears."

[screenshot: default pipeline mesh with the SfM point cloud, showing the "ears"]

Here's the best I've managed so far:

[screenshot: best result so far]

First I tried modifying the DepthMapFilter node and ended up with Min View Angle = 10 (the maximum). It removed only a little bit of the "ears".

Then I repeated the experiment a few times with the DepthMap node: adjusting Min View Angle to 10 (max) there removed roughly 25% of them. Much more effect than DepthMapFilter.

The biggest effect came from changing the StructureFromMotion options: Min Observation For Triangulation -> 4 and Min Angle For Triangulation -> 8. Even smaller changes really cleaned up the point cloud in areas away from the subject of interest. But now I'm starting to lose the trunk, and I still have some small artifacts at the top.
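(A note for anyone reading along, as I understand it: the angle being thresholded here is the parallax angle between the two camera rays that observe a point, i.e. for camera centers $C_1$, $C_2$ and a candidate 3D point $X$,

$$\theta = \arccos\frac{(X - C_1)\cdot(X - C_2)}{\lVert X - C_1\rVert\,\lVert X - C_2\rVert}.$$

Distant background points are seen with nearly parallel rays, so their $\theta$ is tiny and they triangulate very noisily; raising the minimum angle rejects them first, which would explain why the clusters away from the tree disappear before the trunk does.)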

Here's a close up of the point cloud looking down from the top:

[screenshot: top-down close-up of the point cloud]

And the mesh as a solid:

[screenshot: the mesh rendered as a solid]

It looks to me like there is a "fold" in the mesh at the edge of the subject of interest, and some sort of error has accumulated in the meshing stage. There don't seem to be any points in the point cloud around this area.

@natowi @fabiencastan @NexTechAR-Scott I'm guessing that these remnants are really coming from the meshing process: maybe Min Observations For SfM Space Estimation, or Min Observations Angle For SfM Space Estimation? Can someone explain what these parameters control?

Appreciate your feedback. I feel like I'm starting to get a sense of how to set this up as a series of experiments with each step of the pipeline. Once I get my head around this a bit better and figure out the render farm, I think I'll be ready to add this as a module to the CAD/CAM classes I sometimes teach.

@simogasp (Member)

Maybe you can also try playing with the MeshFiltering node: reduce the large triangles via the Filter Large Triangles Factor, and keep only the largest mesh to remove unconnected parts.
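A minimal sketch of setting those two parameters by editing the saved .mg project directly (it is plain JSON, assuming the usual layout with a top-level "graph" dictionary of nodes); the attribute names below are assumed from the UI labels, so verify them against your node and back up the project first:

```python
import json

# Sketch: patch MeshFiltering inputs in a saved Meshroom project (.mg is plain JSON).
# Attribute names are assumed from the UI labels ("Keep Only The Largest Mesh",
# "Filter Large Triangles Factor"); adjust if your Meshroom version differs.
project = "project.mg"

with open(project) as f:
    graph = json.load(f)

for name, node in graph["graph"].items():
    if node.get("nodeType") == "MeshFiltering":
        inputs = node.setdefault("inputs", {})
        inputs["keepLargestMeshOnly"] = True          # drop unconnected components
        inputs["filterLargeTrianglesFactor"] = 40.0   # smaller factor prunes more large triangles (example value)
        print(f"patched {name}")

with open(project, "w") as f:
    json.dump(graph, f, indent=4)
```

Reopening the project in Meshroom should show the changed values in bold, the same as if they had been edited in the UI.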

@NexTechAR-Scott

SfM angles essentially act as a bounding box.

Think of 0 as looking straight ahead, meaning everything in the background of the subject would get reconstructed.

90 would effectively be looking straight down on the object.

The higher the angle, the "smaller" the bounding box for reconstruction.

Obviously too high an angle can be as bad as too low.

40 is a reasonable compromise when shooting in the wild.

If using a light box, the default is fine.

I've got a few settings I can post for you later when I get home.

@julianrendell (Author)

Thanks @simogasp! I had already selected Keep Only The Largest Mesh; these 'ears' are connected to the main mesh. I'll try the large triangles setting.

@julianrendell (Author)

Thanks @NexTechAR-Scott! That's a good explanation. I'm just realizing what it can mean to be able to tweak these angles at each stage.

If you'd be willing to share a couple of pipelines (outdoor vs. light box), that'd be really appreciated. I have personal and local community projects at both scales. (One of my sons is Blender-mad right now and itching to try the Blender plugin to see what we can come up with!)

@NexTechAR-Scott commented May 24, 2020

> If you'd be willing to share a couple of pipelines (outdoor vs. light box), that'd be really appreciated. I have personal and local community projects at both scales. (One of my sons is Blender-mad right now and itching to try the Blender plugin to see what we can come up with!)

Here is the default project.mg I start with and tweak from there as needed.

It works for 90% of what I do with no other tuning, but a lot of this depends on the subject matter.

If you have not already noticed, bold entries in node parameters indicate a change from default.

Open in MR and add your data set.

GRAPHFILE.mg.zip
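For anyone who wants to reuse a template like this without the GUI, here is a rough sketch; it assumes the meshroom_photogrammetry script bundled with the 2019/2020 releases and its --pipeline/--input/--output flags (renamed meshroom_batch in later versions), so check --help for your build:

```python
import subprocess

# Sketch: run a shared .mg template headlessly on a folder of images.
# Script name and flags assumed from the Meshroom releases of this era; verify with --help.
subprocess.run(
    ["meshroom_photogrammetry",
     "--pipeline", "GRAPHFILE.mg",      # the template attached above
     "--input", "path/to/images",       # placeholder: your image folder
     "--output", "path/to/output"],     # placeholder: where results should go
    check=True,
)
```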

@julianrendell (Author)

@NexTechAR-Scott, appreciated! I had noticed the bold for changes. Looking forward to trying your settings on the sample dataset later today.

@stale bot commented Sep 22, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

The stale bot added the stale label on Sep 22, 2020.
@stale bot commented Sep 30, 2020

This issue is closed due to inactivity. Feel free to re-open if new information is available.

The stale bot closed this issue as completed on Sep 30, 2020.