Tried an approach for text case in pixelmatch #7552

Open · wants to merge 2 commits into base: dev-2.0
Conversation

Vaivaswat2244

Resolves #7496

Changes:

Modified the checkMatch function in visualtest.js to be more tolerant of slight text position shifts while still catching meaningful changes. Key changes:

Added text detection using pixel pattern analysis:

  • Looks for moderate density and high contrast transitions typical of text
  • Helps identify regions where small shifts should be allowed
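The PR text doesn't include the detection code itself, but a heuristic along the lines described (moderate ink density plus frequent high-contrast transitions) might be sketched like this. The function name `isLikelyText` appears later in the conversation; the specific thresholds below are illustrative assumptions, not the values used in the PR:

```javascript
// Hypothetical sketch of a text-likeness heuristic over an RGBA pixel region.
// Text regions tend to have a moderate fraction of "ink" pixels and many
// sharp luminance jumps between horizontal neighbours.
// All threshold values here are illustrative assumptions.
function isLikelyText(data, width, height) {
  let inkPixels = 0;
  let transitions = 0;
  const luminance = (i) =>
    0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = 4 * (y * width + x);
      const lum = luminance(i);
      if (lum < 128) inkPixels++; // dark pixel counts as "ink"
      // Count high-contrast jumps between horizontal neighbours
      if (x > 0 && Math.abs(lum - luminance(i - 4)) > 100) transitions++;
    }
  }

  const density = inkPixels / (width * height);
  const transitionRate = transitions / (width * height);
  // Moderate density plus frequent sharp edges is typical of rendered text;
  // near-empty or near-solid regions are rejected.
  return density > 0.05 && density < 0.5 && transitionRate > 0.05;
}
```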

Implemented shift tolerance for text regions:

  • Allows up to 2-pixel shifts for detected text areas
  • Still fails on actual content changes
  • Configurable via shiftThreshold parameter
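The shift-tolerance idea above could be sketched roughly as follows. The `shiftThreshold` name and the 2-pixel default come from the PR description; the function name `matchesWithShift` and the exact-RGBA-equality matching are my simplifications of whatever the real `checkMatch` change does:

```javascript
// Sketch of shift tolerance: a pixel that differs from the expected image is
// forgiven if the same colour appears within `shiftThreshold` pixels of the
// corresponding location in the expected image. `matchesWithShift` is a
// hypothetical helper; exact RGBA equality stands in for the real comparison.
function matchesWithShift(actual, expected, width, height, x, y, shiftThreshold = 2) {
  const idx = (px, py) => 4 * (py * width + px);
  const a = idx(x, y);
  for (let dy = -shiftThreshold; dy <= shiftThreshold; dy++) {
    for (let dx = -shiftThreshold; dx <= shiftThreshold; dx++) {
      const nx = x + dx;
      const ny = y + dy;
      if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
      const e = idx(nx, ny);
      if (
        actual[a] === expected[e] &&
        actual[a + 1] === expected[e + 1] &&
        actual[a + 2] === expected[e + 2] &&
        actual[a + 3] === expected[e + 3]
      ) {
        return true; // same colour found nearby: treat as a shift, not a change
      }
    }
  }
  return false; // no nearby match: a real content change
}
```

A checker built on this would only count a pixel as failed when `matchesWithShift` returns false for it in both directions (actual vs. expected and expected vs. actual).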

Screenshots of the change:

PR Checklist

@davepagurek
Contributor

Thanks @Vaivaswat2244, nice approach! I have a few questions that might help us determine how to get this across the finish line.

  • Do you have a sense of how often isLikelyText returns true, and whether it gets called on anything non-text? Rather than trying to detect heuristically whether there is text, I wonder if we can directly record when text is called. Since we pass a p5 instance to the test case anyway, maybe we can do something like this to patch text() to record that it got called:

    // Wrap text() so that any call during the test flags the sketch as text-based
    let hasText = false
    const prevText = myp5.text
    myp5.text = function(...args) {
      hasText = true
      return prevText.apply(this, args)
    }
  • It looks like some tests are failing on CI now. Do you know if these look like (1) false positives that the checker needs to address, (2) cases where we need to regenerate the screenshots because the old ones were out of date but still passed on CI under the old pixel checker, or (3) are actual bugs in p5 that are surfaced now by using a better pixel checker?

@Vaivaswat2244
Author

Vaivaswat2244 commented Feb 16, 2025

Yeah sure, patching text() makes much more sense; I can implement that.
Regarding the failing tests, I think these are the 2nd case you described: they failed CI after I regenerated the screenshots on my machine, even though the old checker had passed them (and the previous screenshots seemed fine). I'm not sure how to move ahead from here, so for now I'll implement the text() change and see how that works out.

@davepagurek
Contributor

Are you able to see the test output from CI on your PR by clicking the Details button here?
[screenshot: CI checks panel with the Details button]

For a next step, I'd look at the test output, and try regenerating the failed screenshots locally and pushing those. We can see if that fixes it on CI, and also by looking at the new images in the Files Changed tab, whether we think the changed images are expected.

@Vaivaswat2244
Author

Hey, so I just noticed that the files I added in the earlier commit aren't the ones failing the tests on CI; the failures are in other files that I never touched.

I then updated the files that were failing on CI and ran the tests locally. The tests now fail for different files in the typography and webgl cases, so I suspect there is a problem with my changes in the visualtest file. Maybe I'll revert these changes and think of something else.

@davepagurek
Contributor

Just looking at the last failure in the test file:

Expected:
[screenshot: expected test output]

Received:
[screenshot: received test output]

Diff:
[screenshot: diff image]

...it looks like a few isolated pixels are off. I wonder if we can do something to augment pixelmatch and check if the difference exists just on a single pixel (so no surrounding pixels are also flagged) and then ignore those?
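The isolated-pixel idea could be implemented as a post-pass over pixelmatch's output. The sketch below is a hypothetical illustration: `countClusteredDiffs` and the boolean diff-mask representation are assumptions (pixelmatch itself returns a mismatch count and writes a diff image), but the logic shows the suggested filter, counting only differing pixels that have at least one differing 8-neighbour:

```javascript
// Sketch of the suggested filter: given a boolean mask of pixels that
// pixelmatch flagged as different, count only "clustered" differences,
// i.e. flagged pixels with at least one flagged 8-neighbour. Isolated
// single-pixel differences (likely antialiasing noise) are ignored.
function countClusteredDiffs(diffMask, width, height) {
  let clustered = 0;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      if (!diffMask[y * width + x]) continue;
      let hasDiffNeighbour = false;
      for (let dy = -1; dy <= 1 && !hasDiffNeighbour; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          if (dx === 0 && dy === 0) continue;
          const nx = x + dx, ny = y + dy;
          if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
          if (diffMask[ny * width + nx]) { hasDiffNeighbour = true; break; }
        }
      }
      if (hasDiffNeighbour) clustered++; // only non-isolated diffs count
    }
  }
  return clustered;
}
```

The test would then fail only when `countClusteredDiffs` exceeds some threshold, rather than on pixelmatch's raw count.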
