futurecoder
Increase test coverage in the backend
The tests mostly cover the golden paths, i.e. users getting steps right or triggering a message step. There's lots of untested code, and it can be found pretty easily by running the tests with coverage. Some stuff is tested by test_frontend, but that's slow and more difficult to measure coverage for, so we should have pure Python tests for it. I'm particularly interested in the edge cases of bad code submissions from users, e.g.:
- Messing up an `ExerciseStep` in various ways, e.g. writing a solution that satisfies your own sample inputs but not the other tests.
- Messages in steps that are returned manually, i.e. `return dict(message='...')` instead of a `MessageStep`. These tests would be specific to such steps.
- Submissions that trigger `Disallowed`
- Submissions that trigger linting errors
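To make these more concrete, here's a rough sketch of what one such targeted test might look like. The `run_submission` helper, the step name, and the response keys are placeholders for illustration only, not the real core API; the point is just that each bad-submission scenario becomes a small parametrized pytest case.

```python
import pytest


def run_submission(step_name: str, code: str) -> dict:
    """Hypothetical stand-in for however the existing tests feed a code
    submission to a step and collect the response (passed, message, etc.)."""
    raise NotImplementedError("wire this up to the real step-checking machinery")


@pytest.mark.parametrize("bad_code", [
    # A 'solution' that hard-codes the expected output for the user's own
    # sample inputs, so it satisfies their example but should fail the other tests:
    "def solution(nums):\n    return [1, 2, 3]",
    # An empty function body, another easy way to mess up an ExerciseStep:
    "def solution(nums):\n    pass",
])
def test_exercise_step_rejects_bad_solutions(bad_code):
    # Hypothetical step name; a real test would target actual course steps.
    response = run_submission("SomeExerciseStep", bad_code)
    assert not response.get("passed")
    # The user should get some feedback rather than a silent failure:
    assert response.get("message")
```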
Are these tests for the 'core' module? Where are the existing tests located? I have seen some tests based on Selenium, which is for the front end.
It's all under `tests`. test_frontend uses Selenium; the other two test code within `core`. The tests described in the issue would be a bit like test_steps but more targeted.
BTW the 'backend' used to be a Django server; now it just refers to `core`, which runs in the browser in Pyodide. But you shouldn't need to know anything about Pyodide to write and run these kinds of tests: regular 'local' Python and pytest should work.
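For reference, a minimal way to run those local tests with the coverage report mentioned in the issue, driven from plain Python (this assumes pytest plus the pytest-cov plugin are installed, and the `core` / `tests` layout described above):

```python
# Run the pure-Python suites locally with coverage measured over `core`.
# Requires pytest and pytest-cov; no Pyodide or browser build involved.
import sys

import pytest

# --cov-report=term-missing lists the uncovered lines, which is the quickest
# way to spot the untested code paths this issue is asking for.
sys.exit(pytest.main(["--cov=core", "--cov-report=term-missing", "tests"]))
```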