
Increase test coverage in the backend

Open · alexmojaki opened this issue 4 years ago • 2 comments

The tests mostly cover the golden paths, i.e. users getting steps right or triggering a message step. There's lots of untested code, which can be found pretty easily by running the tests with coverage. Some of it is tested by test_frontend, but that's slow and more difficult to measure coverage for, so we should have pure Python tests for it. I'm particularly interested in the edge cases of bad code submissions from users (a rough sketch of such a test follows the list), e.g.:

  • Messing up an ExerciseStep in various ways, e.g. writing a solution that satisfies your own sample inputs but not the other tests.
  • Messages in steps that are returned manually, i.e. return dict(message='...') instead of a MessageStep. These tests would be specific to such steps.
  • Submissions that trigger Disallowed.
  • Submissions that trigger linting errors.
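
A minimal sketch of one such test, assuming a hypothetical check_submission helper standing in for whatever entry point core actually exposes for evaluating a user's code against a step (the step name and response keys are placeholders, not real futurecoder API):

```python
# Sketch of a targeted pytest test for bad submissions.
# `check_submission` is a HYPOTHETICAL placeholder: wire it up to
# whatever core actually uses to run user code against a step.
import pytest


def check_submission(step_name, code):
    """Hypothetical helper: run `code` against the named step and
    return the response dict core would send to the frontend."""
    raise NotImplementedError("replace with the real core entry point")


@pytest.mark.parametrize("bad_code", [
    # Satisfies the user's own sample inputs but not the hidden tests.
    "def solution(x):\n    return [1, 2, 3]\n",
    # Uses a construct the step disallows.
    "import os\n",
])
def test_bad_submissions_do_not_pass(bad_code):
    response = check_submission("some_exercise_step", bad_code)
    assert not response.get("passed")
    # A bad submission should come back with an explanatory message.
    assert response.get("message")
```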

alexmojaki · Feb 21 '21 10:02

Are these tests for the 'core' module? Where are the existing tests located? I have seen some tests based on Selenium, which are for the front end.

harimm · Nov 03 '21 15:11

It's all under tests. test_frontend uses Selenium; the other two test code within core. The tests described in this issue would be a bit like test_steps, but more targeted.

BTW, the 'backend' used to be a Django server; now it just refers to core, which runs in the browser via Pyodide. But you shouldn't need to know anything about Pyodide to write and run these kinds of tests: regular 'local' Python and pytest should work.
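
For example, something like this should run just the pure-Python tests locally (a sketch; the paths assume the layout described above, with everything under tests/):

```python
# Run the pure-Python core tests with plain pytest, skipping the
# slow Selenium-based frontend suite.
import pytest

pytest.main(["tests", "--ignore=tests/test_frontend.py"])
```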

alexmojaki · Nov 03 '21 15:11