Best practices for automated testing in downstream libraries?
Hi,
I'm writing a package for animal biomechanics simulation using MuJoCo. I'm looking for some help with testing and automation.
Since exact reproducibility of the engine "is only guaranteed within a single version, on the same architecture" (according to the Reproducibility section of the docs), what are the best practices for automated testing in downstream libraries that use MuJoCo? Asserting that results match stored outputs bit-for-bit is clearly a bad idea...
Thank you in advance for your suggestions.