Add `_tests` folder with a set of test files to run against the application
Is your feature request related to a problem?
It would be cool to be able to provide a set of test files that run against the application to validate whether the solution actually passes.
Describe the solution you'd like.
Add the ability to create a _test folder with tests for that specific lesson.
Describe alternatives you've considered.
Additional context
No response
You can add test cases in the `_files` folders and in `src/templates`, and then create a script in `package.json` that runs your test command. In `mainCommand` you can then define this script to be run.
Something like this should work:
src/
├── content
│   └── tutorial
│       ├── meta.md
│       └── math
│           └── exercises
│               ├── meta.md
│               └── sum
│                   ├── meta.md
│                   ├── _files
│                   │   └── sum.js
│                   └── _solution
│                       └── sum.js
└── templates
    └── default
        ├── sum.test.js
        └── package.json
// _files/sum.js
export function sum(a, b) {
  return 0; // TODO Fix me
}

// _solution/sum.js
export function sum(a, b) {
  return a + b;
}
// templates/default/sum.test.js
// Assumes the runner exposes test()/expect() as globals (e.g. Vitest with globals enabled)
// and that the lesson's _files are merged with the template, so sum.js sits next to this file.
import { sum } from './sum.js';

test('sum adds numbers', () => {
  expect(sum(2, 3)).toBe(5);
});
// package.json
"scripts": {
  "test": "<your test runner command here>"
}
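For example, assuming Vitest as the test runner (any runner that works inside WebContainers should do), the template's full package.json could look roughly like this:

// templates/default/package.json (a fuller sketch, assuming Vitest)
{
  "name": "tutorialkit-template",
  "private": true,
  "scripts": {
    "test": "vitest run"
  },
  "devDependencies": {
    "vitest": "^1.6.0"
  }
}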
// meta.md
mainCommand: ["npm run test", "Running tests"]
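For context, the lesson's meta.md frontmatter could then look roughly like the sketch below; only mainCommand comes from the snippet above, the other fields are typical lesson metadata and worth double-checking against the TutorialKit docs:

// meta.md (frontmatter sketch)
---
type: lesson
title: Sum two numbers
mainCommand: ["npm run test", "Running tests"]
---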
@AriPerkkio would that gate users from proceeding to the next lesson (or some other type of indication that they "successfully completed" the challenge)? If not, I think that sort of functionality would be super useful
We don't yet provide any way to prevent users from proceeding through the tutorial without completing previous lessons.
I think for this feature we could add support for:
- Running test cases against the code editor's current code and the preview's DOM
- Showing the results of these test cases in the UI
- Optionally hiding the "Next lesson" link in the UI when the test cases do not pass. Note that this doesn't prevent navigating via URL, as TutorialKit's Astro builds are just static files.
For the test cases we could provide our own test() / it() function that renders ✅ in the UI when the callback passes, or ❌ when it throws.
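Roughly, such a helper could look like the sketch below (hypothetical, not an existing TutorialKit API; the actual UI rendering is only stubbed with a log here):

// test-ui.js (hypothetical sketch of a pass/fail reporting test() helper)
export async function test(name, callback) {
  let status = '✅';
  try {
    // Support both sync and async test callbacks.
    await callback();
  } catch {
    status = '❌';
  }
  // Placeholder: a real implementation would render the result into the lesson UI.
  console.log(`${status} ${name}`);
}

// it() is just an alias for test().
export const it = test;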
> @AriPerkkio would that gate users from proceeding to the next lesson (or some other type of indication that they "successfully completed" the challenge)? If not, I think that sort of functionality would be super useful
Yeah, that was my idea, and I like this suggestion from @AriPerkkio:
> For the test cases we could provide our own test() / it() function that renders ✅ in the UI when the callback passes, or ❌ when it throws.