PSyclone
Use doctest to validate example code in docstrings (esp. for Transformations)?
I've been vaguely aware of doctest (https://docs.python.org/3/library/doctest.html) for a while but have just had a play with it and think it could be useful for us to make sure that our docstrings that contain example code (particularly in the various Transformation classes) are kept up-to-date. Usage is:
$ python -m doctest -v domain/nemo/transformations/create_nemo_invoke_schedule_trans.py
or Sphinx provides the `doctest` target (but I'm not entirely sure which docstrings are in scope when you do that).
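As an illustration of what doctest checks (a minimal, hypothetical function, not PSyclone code): doctest executes the `>>>` lines found in a docstring and compares what they print against the text that follows them.

```python
import doctest


def double(x):
    """Return twice the input.

    The example below is executed by ``python -m doctest`` and its
    printed result is compared against the expected ``42``.

    >>> double(21)
    42
    """
    return 2 * x


if __name__ == "__main__":
    # Equivalent to running ``python -m doctest -v this_file.py``.
    doctest.testmod(verbose=True)
```

If the docstring's expected output drifts out of sync with what the code actually does, the doctest run fails, which is exactly the staleness check we want.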
I've been playing with it in #1217 and it seems good. (The only problem I have is that the `Node.view()` method always produces coloured output, and that causes the text comparison done by doctest to fail. We could add an optional `colour` argument to that method though.)
What do @rupertford, @sergisiso, @TeranIvy and @hiker think?
I've just had a play and like it.
I was worried about any code snippets that are not executable, but presumably we can just not add `>>>` to those - though if so, will they still be picked up by Sphinx?
Also, presumably you were thinking of adding it as part of the CI?
For info: it took me a while to work out that continuation lines should have "... "!
Yes, I forgot to mention the continuation lines - it took me quite a while to get it working too!
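To illustrate the continuation-line syntax (a standalone sketch using doctest's programmatic API, not PSyclone code): lines that continue a single statement, such as a loop body, must be prefixed with `... ` rather than `>>> `.

```python
import doctest

# A doctest snippet: the body of the for-loop is a continuation line and
# therefore needs the "... " prefix rather than ">>> ".
SNIPPET = '''
>>> total = 0
>>> for i in range(4):
...     total += i
>>> total
6
'''

# Parse and run the snippet directly, much as "python -m doctest" would.
test = doctest.DocTestParser().get_doctest(SNIPPET, {}, "snippet", None, 0)
results = doctest.DocTestRunner().run(test)
```

Getting the `... ` prefix (including the trailing space before the code) wrong is what typically produces the confusing parse failures mentioned above.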
For Sphinx, we can always tell it that code snippets are Python (but I think that's its default anyway) - it doesn't need the `>>>`.
I think it makes sense to do it as part of the CI - it shouldn't add much to the run time.
The only snag is that it makes the docstrings very verbose but then, they are almost certainly a lot more useful if the code fragments they show are actually correct :-)
My only concern would be that the examples would need to become a lot more elaborate, since we would then have to have fully working code, not just a few lines showing the important parts. And requiring working examples would also mean more work for us ... but:
I'm not sure I've ever actually added a 'real' example; I mostly add them to the manual, and I admit that's not ideal, since that code is not tested :(
Would there be any chance to combine this? I vaguely remember that we can refer to py files in the documentation, including line numbers to be included. That might be nice.
Yes, that is the main drawback although, as you say, we only need to go this route in a few places. I believe doctest can do its thang on pretty much any file - it just looks for '>>>' - so we could use it to test the manual. I'm not sure what you mean by 'combine this'? It might be that that's what Sphinx's `doctest` build target does. I shall have a google.
Well, google tells me that we can mark-up sections of our documentation and then doctest will test them (https://www.sphinx-doc.org/en/master/usage/extensions/doctest.html). However, even without doing that, if you do `make doctest` on our current documentation then some 'tests' get run. Maybe it does pick up the docstrings from classes imported into the documentation?
The sphinx doctest markup allows you to specify the 'test environment' setup just once so that ought to cut down on the number of imports we'd need.
'combine' as in: have examples actually tested and automatically included in our docs, i.e. we could guarantee that the examples in our documentation work (I believe some of the dependency code examples were outdated). That all sounds great!
Since some of the docs I had to update in #1294 contained code fragments, I've done a bit of this work there. It seems that you do have to add `>>>` if you want the code to be picked up, and adding the various module imports under a `.. testsetup::` section works fine.
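For reference, a sketch of how this looks in the .rst source (the module path and class name here are illustrative, not necessarily the exact imports used in #1294):

```rst
.. testsetup::

    from psyclone.psyir.nodes import Schedule

.. doctest::

    >>> sched = Schedule()
    >>> isinstance(sched, Schedule)
    True
```

The `testsetup` content runs before every doctest block in the file but is not rendered in the built documentation, so the visible snippets stay short.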
https://pymotw.com/2/doctest/ is very useful. In principle we can run the tests in a single file by doing e.g. `python -m doctest -v psyir.rst`, but this obviously doesn't understand the Sphinx-specific `testsetup` section. All tests in a file share the execution environment, so any imports only need to be done once.
While reviewing #1846 I realised that we still can't do `python -m doctest transformations.py` (or rather, it reports a lot of failures). Rather than drag out #1846, we can look at that here. The problem is that we've parameterised some of the docstrings to use a variable (giving the location of a source file), but that variable is only set in e.g. the developer guide source.
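One possible way out (a sketch against doctest's public API; the variable name and path are made up): supply the parameter through the test globals, which is effectively what a `testsetup` section in the guide does for Sphinx.

```python
import doctest

# A docstring snippet parameterised on a variable that the module itself
# never defines (mimicking the situation described above).
SNIPPET = '''
>>> print(SOURCE_FILE)
example/source.f90
'''

# Plain "python -m doctest" on the module would hit a NameError here.
# Injecting the variable via the test globals, as a Sphinx testsetup
# section would, lets the example run and pass.
test = doctest.DocTestParser().get_doctest(
    SNIPPET, {"SOURCE_FILE": "example/source.f90"}, "snippet", None, 0)
results = doctest.DocTestRunner().run(test)
```

So the question for the parameterised docstrings is where to define that global when the file is tested standalone rather than via the developer guide.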