dask-jobqueue

Add a CONTRIBUTING.md with how to run the tests

Open · lesteve opened this issue 6 years ago • 6 comments

This would be very useful. I am just jotting down things quickly, so some things may not be completely accurate. If someone wants to do it, that would be more than welcome!

I think for now adding a CONTRIBUTING.md, and possibly a link to it from the README.md, is good enough. An alternative is to put it somewhere in the docs, but that is a bit more effort.

tests that don't need a cluster

pytest dask_jobqueue

tests that need a cluster

They are marked with @pytest.mark.env("cluster-name") before the test function. To run them on a cluster (the cluster-name needs to match the name passed to pytest.mark.env):

pytest dask_jobqueue -E <cluster-name>

e.g. on my SGE cluster I do:

pytest dask_jobqueue -E sge
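
For illustration, a marked test might look roughly like the sketch below (the test name, cluster parameters, and assertion are made up for this example, not taken from the actual test suite):

import pytest
from dask.distributed import Client
from dask_jobqueue import SGECluster

@pytest.mark.env("sge")  # only run when pytest is invoked with -E sge
def test_simple_computation_on_sge():
    # cores and memory are required keyword arguments; on a real cluster you
    # may also need queue/project/resource_spec tweaks to match your site
    with SGECluster(cores=1, memory="2GB") as cluster:
        cluster.scale(1)  # request one worker job from the job scheduler
        with Client(cluster) as client:
            assert client.submit(lambda x: x + 1, 1).result() == 2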

running tests locally with our docker-compose setup

For some job schedulers (SGE, PBS, and SLURM at the time of writing), we have a docker-compose setup to run the tests locally. For SGE, for example:

cd ci/sge
./start-sge.sh

Sometimes there is a problem (I have never figured out exactly when) and you need to regenerate the Docker images from scratch. There may be a better way to do this, to be honest, because my feeling is that you only want to rerun the conda install part of the Docker image, not the full build:

cd ci/sge
docker-compose build --no-cache
./start-sge.sh

lesteve · Aug 23 '19 09:08

@basnijholt, since you opened a few PRs recently: if you are interested in working on this, and you happen to have some free time of course, that would be more than welcome!

lesteve · Aug 23 '19 09:08

@lesteve shouldn't this be put in https://jobqueue.dask.org/en/latest/develop.html ?

guillaumeeb · Aug 23 '19 10:08

Indeed, I didn't know about the dev docs (or I forgot); this is nice.

I guess the main things that need to be added are:

  • can you run the tests on a cluster with pytest? This is useful for job schedulers where we don't have any docker-compose setup, but I am not sure it actually works. You may need to raise QUEUE_WAIT, and you may need additional tweaks (for example to activate the right conda environment or define the right queue); see the sketch after this list. In any case it would be nice if we had a way to run tests on real clusters and this were in the developer doc.
  • the docker-compose build --no-cache (I got it from dask-drmaa actually: https://github.com/dask/dask-drmaa#testing)
  • not crucial, but I am a bit unsure about the jobqueue_before_install, jobqueue_install, etc. scripts; I'd rather have the ./start-sge.sh cluster and the docker command as in the dask-drmaa doc above. I agree the CI and the local tests can then diverge, but at least you understand a bit better what you are doing.
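
To make the QUEUE_WAIT point above concrete, here is a made-up sketch (not the actual test code) of the kind of wait the tests do; on a real, possibly busy cluster the worker jobs can take much longer to start than in the docker-compose setup, so the timeout has to be raised:

from dask.distributed import Client

QUEUE_WAIT = 300  # seconds; raise this if the real cluster's queue is slow

def wait_for_one_worker(cluster):
    with Client(cluster) as client:
        # fails if no worker has joined the scheduler within QUEUE_WAIT seconds
        client.wait_for_workers(1, timeout=QUEUE_WAIT)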

lesteve · Aug 23 '19 10:08

This would definitely be helpful. I'm especially interested in the information here, given that the HTCondor testing is currently... limited, and I would be interested in helping to improve that.

mivade · Aug 23 '19 15:08

@mivade help on improving HTCondor testing would be more than welcome!

There are two things as far as I can tell:

  • add tests that don't need a cluster; this only means testing cluster.job_script(). I think there are already some tests like this in dask_jobqueue/tests/test_htcondor.py (see the sketch after this list).
  • getting a similar docker-compose set-up as we have for SLURM, PBS and SGE. This requires a bit more effort, but https://github.com/dask/dask-jobqueue/issues/247#issuecomment-473418030 may help.
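
For the first point, here is a minimal sketch of such a test (the test name and assertion are illustrative; the exact expected contents depend on the HTCondor job template):

from dask_jobqueue import HTCondorCluster

def test_job_script_mentions_worker_command():
    # no real HTCondor scheduler is needed: job_script() only renders the
    # submission material from the given keyword arguments
    with HTCondorCluster(cores=1, memory="1GB", disk="1GB") as cluster:
        job_script = cluster.job_script()
        # the dask worker command should show up somewhere in the script
        assert "dask-worker" in job_script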

lesteve · Aug 26 '19 08:08

@lesteve, thanks for the pointers! I'll see if I can find some time in the coming days to work on this.

mivade · Aug 26 '19 14:08