ngsi-timeseries-api
Get rid of sleep time in Docker tests
Is your feature request related to a problem? Please describe.
A lot of our tests depend on external Docker processes and can't run until those processes are ready. At the moment, tests that depend on Docker services are run by Bash scripts that bring up the services and then sleep for a number of seconds before starting the pytest
testing session. While long delays work most of the time in practice, in general they are a non-solution to the problem of syncing processes. They also mean long waits everywhere, even in environments where the dependent services start up in the blink of an eye.
Describe the solution you'd like
@amotl suggested a nice way to get rid of those nasty sleep commands by using lovely-pytest-docker. That's a nice lib, but it'd be even nicer if docker_services.wait_for_service let you:
- set a max time past which it should give up polling the service
- use a backoff strategy; even better if you could specify the polling intervals, e.g. first poll after 1 sec, second poll after 1 sec, third poll after 10 secs, etc.
- catch a given set of exceptions E and retry the call if an exception gets raised that's in E, otherwise bail out
You can do all of these things with the backoff library, but backoff doesn't know how to start/stop Docker processes. So is there a way to get the convenience of lovely-pytest-docker and the flexibility of backoff? I think it could be as easy as wrapping this call:
- https://github.com/lovelysystems/lovely-pytest-docker/blob/master/src/lovely/pytest/docker/compose.py#L135
with a backoff decorator. Perhaps we could contribute that to the lovely-pytest-docker project and then use the new, even lovelier version of lovely-pytest-docker in our code.
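For illustration, here's a minimal sketch of what a backoff-decorated readiness check could look like; it covers the three points above (a cap on the total waiting time, exponentially growing polling intervals, and a fixed set of retryable exceptions). The /version endpoint, the 60-second cap and the function name are assumptions for the sketch, not QuantumLeap's actual test code:

```python
import backoff
import requests


# Retry on connection refused (service not up yet) and on non-2xx responses
# (up but not healthy yet), with exponentially growing pauses, for at most 60s.
@backoff.on_exception(backoff.expo,
                      (requests.ConnectionError, requests.HTTPError),
                      max_time=60)
def wait_for_quantumleap(base_url):
    requests.get(base_url + '/version', timeout=2).raise_for_status()
```

Applying the same decorator inside wait_for_service would then boil down to wrapping the library's own check function in pretty much the same way.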
Describe alternatives you've considered
Just use backoff if there's no need to start/stop Docker services? For an example of using it, have a look at reporter.tests.embedded_server.py.
Additional context
See @amotl's comments on #441.
Notes
I think one question we should ask ourselves is why the majority of our tests depend on external services. Ideally, those tests should be the minority, with unit tests taking the lion's share. If most of the tests we cared about were unit tests, all this wouldn't be much of a problem. Lack of modularity and separation of concerns is what's brought us here, I reckon. While this is the fate of most code bases that get developed over many years, we should still try to write clean code if we can. Easier said than done given the huge pressure we're under, but trying doesn't hurt :-)
Hi,
I am happy that you liked my suggestion. As @burnes' reaction to my post was also positive, he might be willing to collaborate with you on adding more of those features to lovely-pytest-docker. Also note that @jeromecremers already asked about the possibility of adding timeout and pause parameters to the wait_for_service() workhorse, see https://github.com/lovelysystems/lovely-pytest-docker/issues/19.
Extending that to an exponential backoff feature would be super nice, indeed. Actually, using @bgreen-litl's / @litl's backoff module for that job would be a blast.
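For what it's worth, backoff's wait generators can also be used on their own, so such a feature could simply draw its pauses from one of them; a tiny sketch, with the max_value cap chosen arbitrarily:

```python
import backoff

# backoff.expo is a plain generator of wait times in seconds: 1, 2, 4, 8, ...
# optionally capped, which a patched wait_for_service could use as its pauses.
waits = backoff.expo(max_value=10)
print([next(waits) for _ in range(6)])   # [1, 2, 4, 8, 10, 10]
```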
With kind regards, Andreas.
Hi @amotl, yeah, it'd be great if someone could do that! Like I said, in principle using a decorator from the backoff library could do the trick...
> in principle using a decorator from the backoff library could do the trick...
Indeed. If that worked together nicely, there would be no need to put it into lovely-pytest-docker; it would just be a matter of appropriately documenting it as an "addon feature".
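Such a documented addon could be as little as a conftest.py fixture that starts the services through lovely-pytest-docker and then runs a backoff-guarded readiness check. The sketch below assumes QuantumLeap's service name, port and /version endpoint; the lovely-pytest-docker calls (start, wait_for_service, docker_ip) are based on its documented usage and should be double-checked against the library:

```python
import backoff
import pytest
import requests


# Give up after two minutes; retry on any requests error in between.
@backoff.on_exception(backoff.expo, requests.RequestException, max_time=120)
def _wait_until_ready(url):
    requests.get(url, timeout=2).raise_for_status()


@pytest.fixture(scope='session')
def quantumleap_url(docker_services):
    docker_services.start('quantumleap')
    public_port = docker_services.wait_for_service('quantumleap', 8668)
    url = 'http://{}:{}'.format(docker_services.docker_ip, public_port)
    _wait_until_ready(url + '/version')
    return url
```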
In progress, see #524.