
Adding Visual Test Report in GitHub Actions

Open Vaivaswat2244 opened this issue 9 months ago • 10 comments

Resolves #7618

This PR adds support for generating an HTML report to compare visual test results more effectively. The goal is to improve accessibility and readability of visual regression test outputs.

Changes:

  • Added a script to generate an HTML report summarizing visual test results.
  • The report includes both the expected and the actual screenshots.
  • Updated the workflow to run the added script after the tests complete.
  • Edited visualTest.js to save the recorded screenshots to an actual-screenshots folder.
  • The generated HTML report is then uploaded as an artifact.
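For illustration, the core of such a report generator might look roughly like this. This is a sketch only: buildReportHtml and the result fields are hypothetical names, not necessarily what the script in this PR uses.

```javascript
// Illustrative sketch: build an HTML comparison table from a list of
// visual test results. The real script may structure this differently.
function buildReportHtml(results) {
  const rows = results.map(r => `
    <tr class="${r.passed ? 'pass' : 'fail'}">
      <td>${r.name}</td>
      <td><img src="${r.expected}" alt="expected"></td>
      <td><img src="${r.actual}" alt="actual"></td>
      <td>${r.passed ? 'PASS' : 'FAIL'}</td>
    </tr>`).join('');
  return `<!DOCTYPE html>
<html><body>
<table>
  <tr><th>Test</th><th>Expected</th><th>Actual</th><th>Status</th></tr>${rows}
</table>
</body></html>`;
}
```

A "show only failed" filter can then be a few lines of client-side JS that toggles rows by the pass/fail class.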

Screenshot of the change: this is the generated HTML page, showing the expected and actual visuals for each test.

[image]

@davepagurek @ksen0 Can you have a look at this? You can go to Actions, open the run Summary, and download the report from the artifacts section.

PR Checklist

Vaivaswat2244 avatar Mar 21 '25 14:03 Vaivaswat2244

Nice work @Vaivaswat2244! I'm a little less familiar with GitHub Actions myself; @ksen0 and @limzykenneth, does this look OK to you?

davepagurek avatar Mar 25 '25 13:03 davepagurek

Hello @davepagurek @limzykenneth @ksen0

Just following up on this PR: I'm wondering if there's any additional feedback, or if we should move forward with the implementation as is. If you'd prefer to explore alternative approaches first, or if we are waiting for the pr05 project, I'd be happy to convert this to a draft PR in the meantime. Otherwise, I'm ready to address any feedback to get this merged.

Vaivaswat2244 avatar Apr 24 '25 19:04 Vaivaswat2244

Thanks for your patience @Vaivaswat2244! I will have time next week (when the GSoC reviews are done :) to review this, though if someone else reviews it before then, their feedback is welcome. Sorry for the wait.

ksen0 avatar Apr 24 '25 20:04 ksen0

Hey @ksen0 , can this be reviewed now?

Vaivaswat2244 avatar May 14 '25 05:05 Vaivaswat2244

Hi @Vaivaswat2244 thanks for all the patience on this and for the reminder. It's really nicely done! I've added notes on 3 points below, let me know what you think about all that, if you have any other ideas (or any questions).

The report looks great, especially the filter for "show only failed."

(1) There are false positives for FAIL: these tests passed but are missing images. Could you investigate why this happens, please?

[image]

(2) Very, very minor nice-to-haves: for consistency, use lowercase step names ("upload visual test report"). If it's easy to add, include a link to the PR (or its number) in the report HTML. Feel free to ignore this feedback.

(3) Major, execution conditions: if there is a failing case, npm test fails and the report is not generated, preventing it from being used. This is the desired outcome when tests fail:

[image]

In the example above I'm using:

name: Testing

on:
  push:
    branches:
      - main
      - dev-2.0
  pull_request:
    branches:
      - '*'

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v1
      - name: Use Node.js 20.x
        uses: actions/setup-node@v1
        with:
          node-version: 20.x
      - name: Get node modules
        run: npm ci
        env:
          CI: true
      - name: build and test
        id: test
        run: npm test
        continue-on-error: true
        env:
          CI: true
      - name: Generate Visual Test Report
        if: always()
        run: node visual-report.js
        env:
          CI: true
      - name: Upload Visual Test Report
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: visual-test-report
          path: test/unit/visual/visual-report.html
          retention-days: 14
      - name: report test coverage
        if: steps.test.outcome == 'success'
        run: bash <(curl -s https://codecov.io/bash) -f coverage/coverage-final.json
        env:
          CI: true
      - name: fail job if tests failed
        if: steps.test.outcome != 'success'
        run: exit 1

I was able to test this by adding an artificial problem into the test suite:

    visualTest("with the default font", function (p5, screenshot) {
      p5.createCanvas(50, 50);
      p5.textSize(20);
      p5.textAlign(p5.LEFT, p5.TOP);
      p5.text("wrong text", 0, 0); // should be 'text' so should definitely fail
      screenshot();
    });

This failure does show up in the report, but without the actual image:

[image]

Looks like writeImageFile(actualFilename, toBase64(actual[i])); only runs on success, so in addition to updating ci-test.yml, please check the logic of visualTest.js. The goal is to ensure the report helps users see the visual test failures.
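One way to address the ordering is sketched below, under the assumption that the save and the comparison currently happen in the same step. The function and parameter names here are hypothetical, not p5.js's actual API.

```javascript
// Hypothetical sketch: write the actual screenshot *before* comparing,
// so a failing assertion still leaves an image behind for the report.
function checkAndRecord(name, actual, expected, writeImageFile, matches) {
  // Save unconditionally, not only after a successful comparison.
  writeImageFile(`actual-screenshots/${name}.png`, actual);
  if (!matches(actual, expected)) {
    throw new Error(`Visual test failed: ${name}`);
  }
}
```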

(4) [Edited to add:] Additional improvements, which could be a separate PR or included in this one: the report could include a diff image, mirroring what visualTest.js provides:

[image]

If problem (3) is addressed, happy to merge!

ksen0 avatar May 16 '25 06:05 ksen0

Got it! I reviewed the code, and yes, currently the report is not generated in case of test failure. This was a logical misunderstanding on my side, which I'll fix. Regarding the false positives I'm not sure yet; I'll test the new logic, which will most likely fix points (1) and (3). I'll also update the script to add the diff image. Thanks for the detailed review on the PR.

Vaivaswat2244 avatar May 17 '25 18:05 Vaivaswat2244

Hey everyone, I've tried to fix the issues mentioned above. I've checked the logic of visualTest.js, and it now writes actual-screenshots unconditionally. I've also updated the ci-test.yml file as @ksen0 recommended. However, I'm stuck on the false-fail issue: the "Drawing triangles" test passes everywhere, yet its actual screenshot is still not written to the actual-screenshots folder, even though the changed logic works for the intentionally failing test. Any help would be appreciated, @ksen0 @davepagurek. I'm continuing to investigate; in the meantime, I've committed fixes for (3) and (4) for review.

Vaivaswat2244 avatar May 20 '25 20:05 Vaivaswat2244

Hey @davepagurek @ksen0 , apologies for the late response. Was busy with some college assignments.

I tracked down the false FAIL shown in the report. It turns out that the test "Shape drawing/2D mode/Drawing triangles" is no longer present in the test files (it was renamed to "drawing with triangles"), but its expected screenshot is still there. The script therefore stores nothing for it in the actual-screenshots folder, and the comparison between the expected and actual screenshots produces a FAIL tag.
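A possible guard against this class of false FAIL, sketched under the assumption that the report script has both name lists available (classifyScreenshots and the status labels are illustrative names): treat an expected screenshot with no matching actual screenshot as "stale" rather than comparing it.

```javascript
// Illustrative sketch: classify expected screenshots whose test no longer
// exists as "stale" instead of comparing them and reporting a false FAIL.
function classifyScreenshots(expectedNames, actualNames) {
  const actual = new Set(actualNames);
  return expectedNames.map(name => ({
    name,
    status: actual.has(name) ? 'compare' : 'stale',
  }));
}
```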

Vaivaswat2244 avatar Jun 10 '25 11:06 Vaivaswat2244

Hey @Vaivaswat2244, I admire your work because it adds really significant functionality to the CI/CD.

Would you be interested in a minor suggestion regarding performance? As far as I can tell, initializing the regex inside the recursive function (const flattenedName = testDir.replace(/\//g, '-');) decreases performance. If you agree, I would like to fork your PR to see if my assumption is correct.

Again, it's just a minor nice-to-have and not necessary.
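For reference, the hoisting being suggested looks roughly like this (sketch only; flattenName is an illustrative name, and in practice the gain may be small since engines are good at caching compiled regex literals):

```javascript
// Sketch of the suggestion: create the regex once at module scope instead
// of on every recursive call. An alternative is String.prototype.replaceAll.
const SLASH_RE = /\//g;

function flattenName(testDir) {
  // String.replace resets the global regex's lastIndex, so reuse is safe.
  return testDir.replace(SLASH_RE, '-');
}
```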

error-four-o-four avatar Jun 10 '25 17:06 error-four-o-four

Thanks @error-four-o-four for your suggestion. I've initialised the regex before the recursive function; you can review that now. Along with that, I've removed the intentionally failing test and the screenshots of the "drawing triangles" test which were causing issues. Can @ksen0 and @davepagurek have a look as well?

Vaivaswat2244 avatar Jun 11 '25 09:06 Vaivaswat2244

Hey @ksen0 @davepagurek, it's been some time since I looked into this. I've been thinking of adding a similar reporting system to the processing4 repository, now that the groundwork has been set up here (not pushed yet, though). I'd love to hear any thoughts on whether this is appropriate, or whether we can design something better for the visual-test reporting systems of both repositories.

Currently, the initial structure of Processing's visual testing system is set up and working; have a look. I'm working on adding more tests to it. In the meantime, I was hoping we could resolve this PR to get better insights into the reporting system.

Vaivaswat2244 avatar Oct 07 '25 05:10 Vaivaswat2244

Hi @Vaivaswat2244, thanks so much for the update and the ping. Apologies this was not reviewed earlier. Could you resolve the merge conflicts on .gitignore, please? I am happy to merge when ready.

ksen0 avatar Oct 14 '25 22:10 ksen0