
Improving Test Coverage

Open akshay-99 opened this issue 5 years ago • 16 comments

Hey, so I stumbled upon this coverage report [screenshot of the coverage report]. I feel there is certainly a lot of scope for work here. We can get it up to 80-85% total coverage to make things more robust. This includes adding tests for uncovered functions (particularly in dom, image, and io).

But even then, coverage won't give the complete picture. For example, even though the above report suggests that saveGif is covered, that doesn't mean much because there aren't comprehensive tests checking its output yet. I can also explore similar cases in the codebase where the tests haven't verified the final output. As another example, things that draw on the canvas are difficult to test, but if needed, individual changes could be verified using getImageData().
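
To illustrate, here is a rough sketch of what verifying canvas output with getImageData() could look like. This is not taken from the existing test suite; it assumes the instance-mode setup, the Mocha TDD interface (suite/test/setup/teardown), and the global chai assert that the current tests use, and the myp5 name is just a convention:

```js
suite('verifying canvas output', function() {
  var myp5;

  setup(function(done) {
    // Instance-mode p5 so a fresh sketch can be created and torn down per suite.
    new p5(function(p) {
      p.setup = function() {
        myp5 = p;
        done();
      };
    });
  });

  teardown(function() {
    myp5.remove();
  });

  test('rect() fills the expected pixels', function() {
    myp5.createCanvas(10, 10);
    myp5.pixelDensity(1); // keep canvas coordinates 1:1 with backing pixels
    myp5.noStroke();
    myp5.fill(255, 0, 0);
    myp5.rect(0, 0, 10, 10);

    // Read one pixel back from the underlying 2D context and check its color.
    var data = myp5.drawingContext.getImageData(5, 5, 1, 1).data;
    assert.deepEqual(Array.from(data), [255, 0, 0, 255]);
  });
});
```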

Please share your thoughts on this. I would love to carry this forward and also explore it as a GSoC project.

I am not sure if this is the right place to ask this.

akshay-99 avatar Mar 16 '20 15:03 akshay-99

Hey, I am working on improving the test coverage ratio too, and I totally agree with your observation. As a matter of fact, starting today I am going to work on it for at least the next two weeks. So, just to avoid stepping on each other's toes, I think we should split up the modules and work on them individually. Let me know what you think. Best of luck with your GSoC experience.

fenilgandhi avatar Mar 17 '20 06:03 fenilgandhi

For starters, I am picking up the dom module.

fenilgandhi avatar Mar 17 '20 06:03 fenilgandhi

@fenilgandhi I am waiting for a nod from some of the people here and to hear their opinions before jumping into things. I didn't know you were working on test coverage too; I had only been seeing the good daily work you are putting into the inline documentation task. But yes, we can work together!

@outofambit @stalgiag @limzykenneth what do you think about improving the coverage, and about verifying the final output even for some functions that already have tests?

akshay-99 avatar Mar 17 '20 08:03 akshay-99

@akshay-99 @fenilgandhi you are both doing great work. It would be nice if the test coverage were increased, especially for the WebGL part in my case. By the way, have you submitted your draft proposal, or are you still waiting to get it reviewed here?

ahujadivyam avatar Mar 17 '20 14:03 ahujadivyam

@akshay-99, since you are starting off afresh, I have a suggestion you can use. I read through the test files of a few modules and realised that the tests have a lot of disparity between them. Maybe, as part of improving the test coverage, you could also document guidelines for unit testing and add them to the unit testing contributor documentation.

In the long term, this would improve not only the coverage but also the quality of the tests. If you want to explore this suggestion, here's a reference for unit testing guidelines.

fenilgandhi avatar Mar 18 '20 11:03 fenilgandhi

@fenilgandhi though I agree that the tests have some disparity between them, I felt most of it was a necessary difference in style for each particular unit. I also feel that the guidelines are sufficiently documented as of now, with an example and a structure. Putting too much focus on pedantic details that cannot be enforced automatically seems counterproductive and may discourage future contributors. I may be wrong.

akshay-99 avatar Mar 18 '20 15:03 akshay-99

I'm not too involved with the testing setup, but as with all other parts of the p5.js project, the key thing to consider is the beginner experience. We want to encourage more people to take a peek at the underlying code (including the test code) without needing too much prior knowledge.

limzykenneth avatar Mar 20 '20 11:03 limzykenneth

@limzykenneth I agree that beginner friendliness is a prime objective. From experience, I can recount an issue I faced while exploring the tests for the first time as a beginner. As far as I can see, the documentation doesn't mention anywhere why there are three different runs of tests:

  1. Examples
  2. Unminified JS
  3. Minified JS

Why the 2nd and 3rd runs are separate was confusing to figure out at first, because they run the same test files again. This was especially confusing because a test for FES was failing in the 3rd run and passing in the 2nd.

Also, running tests with npm run test runs the entire suite. Even if I use .only(), it will still run the examples. This takes a whole lot of time when I am following a trial-and-error approach and just changing a couple of lines in a test. I figured out that I can skip the build process for quick testing using npm run grunt test:nobuild, and I can even skip the example testing by commenting out a few lines in the Gruntfile. This was extremely helpful for speeding up my work, and as far as I can see, it hasn't been mentioned anywhere.
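
For context, focusing a single suite looks roughly like this (a placeholder suite for illustration only; myp5 is assumed to be set up the same way as in the existing tests, and the .only should be removed before committing):

```js
// Only this suite runs; every other suite in the test run is skipped.
suite.only('p5.prototype.saveCanvas', function() {
  test('should be a function', function() {
    assert.ok(myp5.saveCanvas);
    assert.typeOf(myp5.saveCanvas, 'function');
  });
});
```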

Should these couple of explanations and tips & tricks be mentioned?

akshay-99 avatar Mar 20 '20 11:03 akshay-99

My $0.02: you mentioned that the trial-and-error approach takes a lot of time. For that, you can use the --keepalive flag, or better, add the following command to the package.json file.

"test:dev" : "grunt test --keepalive",

fenilgandhi avatar Mar 20 '20 11:03 fenilgandhi

@fenilgandhi sorry I didn't quite get you :sweat_smile: I tried what you said and it starts up a development server, but it doesn't seem to track anything. I tried updating the tests and the source, but nothing seems to happen when a change is made. Maybe I am doing something wrong?

akshay-99 avatar Mar 20 '20 18:03 akshay-99

Ok, let me be more clear.

  • Add a .only to the suite of your module and your method.
  • You can add the given line to package.json and then run npm run test:dev.
  • It will run the server and watch for changes in files.
  • Open http://localhost:9001/test/test.html and you can verify your tests there. After making changes, you can refresh this page to get the latest results. You can also click on individual tests to get more info.

fenilgandhi avatar Mar 21 '20 02:03 fenilgandhi

@fenilgandhi I get you now. That was a great tip! Thanks!

akshay-99 avatar Mar 21 '20 19:03 akshay-99

This has been here for some time now. :sweat_smile: @outofambit @lmccart @limzykenneth @stalgiag I could start working on the above stuff right away but I don't want to start with a barrage of pull requests without any go-ahead of sorts :sweat_smile:

Please tell me if you think some or all of the changes described above would help the library.

akshay-99 avatar Mar 23 '20 16:03 akshay-99

I'm not following this issue very well so if you can clarify a bit for me that would be great. Is this about improving test coverage by creating missing unit tests or is it about looking at the way tests are structured and run?

limzykenneth avatar Mar 23 '20 17:03 limzykenneth

@limzykenneth Though I originally created it for adding new unit tests and extending some existing ones, as the discussion progressed I realised there are some things that could be added to the contributor docs to help beginners. I think this fits under the umbrella of "improvement" too, so I didn't think it needed a separate issue.

akshay-99 avatar Mar 23 '20 18:03 akshay-99