Timing check in automated tests
Is your feature request related to a problem? Please describe. It was recently noticed that this commit slows down the time to solution for a given BeamDyn simulation by approximately 5x. A timing comparison for each test case should be included as part of the automated tests.
Describe the solution you'd like Ideally, this does not require manually maintaining a list of execution times. I'd like to see the automated testing system automatically maintain the execution times for each test case.
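One way such a self-maintaining check could look is sketched below. This is purely illustrative, not part of OpenFAST's r-test: the baseline file name, the tolerance, and the `check_timing`/`run_case` names are all assumptions. The idea is that baselines live in a JSON file the harness updates itself, so nobody curates a list of execution times by hand.

```python
import json
import time
from pathlib import Path

# Hypothetical sketch: timing baselines are stored in a JSON file that the
# test harness maintains automatically. File name, tolerance, and function
# names are illustrative assumptions, not existing OpenFAST tooling.
BASELINE_FILE = Path("timing_baselines.json")
TOLERANCE = 1.5  # flag a case that runs >50% slower than its stored baseline


def check_timing(case_name: str, run_case) -> None:
    """Time run_case(), compare against the stored baseline, and update it."""
    baselines = (
        json.loads(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else {}
    )

    start = time.perf_counter()
    run_case()
    elapsed = time.perf_counter() - start

    baseline = baselines.get(case_name)
    if baseline is not None and elapsed > baseline * TOLERANCE:
        raise AssertionError(
            f"{case_name}: {elapsed:.2f}s vs baseline {baseline:.2f}s "
            f"(exceeds {TOLERANCE:.0%} of baseline)"
        )

    # Keep the smaller of the two so the baseline tracks the best known time.
    baselines[case_name] = min(elapsed, baseline) if baseline else elapsed
    BASELINE_FILE.write_text(json.dumps(baselines, indent=2))
```

A first run for a case simply records its time; later runs fail loudly only when the slowdown exceeds the tolerance, which would have surfaced a 5x regression like the one above during development.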
Is that for a standalone driver case, or a fully coupled case? I'm a bit surprised that commit slowed things down that much.
All the changes referenced are during the initialization step, so this may be an acceptable speed decrease there given the increased accuracy and stability that commit introduces.
@andrew-platt I agree, it may be an acceptable performance hit in this case. Ideally, though, we are not surprised by these changes and instead get feedback on it from the automated system during the development process.
I agree that picking this up in the automated testing would be very useful. We will undoubtedly catch some unintended consequences.
It might also be useful at some point to report the time spent on initialization separately from the overall simulation time.
Yes, good point; it would be great to report those two separately.
The regression test log files do already report simulation CPU time separately from the total CPU run time (subtracting the two gives the initialization time). For example:
```
Total Real Time:       16.28 seconds
Total CPU Time:        16.266 seconds
Simulation CPU Time:   15.609 seconds
Simulated Time:        60 seconds
Time Ratio (Sim/CPU):  3.8438
```
The time ratio reported there does NOT include initialization timings.
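A small sketch of how an automated check could pull these fields out of a log and derive the initialization time (total CPU minus simulation CPU). The field labels match the excerpt above; the function name and regex are illustrative assumptions, not existing OpenFAST test code.

```python
import re

# Matches the timing summary lines that appear at the end of an OpenFAST
# regression-test log, e.g. "Total CPU Time: 16.266 seconds".
TIMING_RE = re.compile(
    r"^\s*(Total Real Time|Total CPU Time|Simulation CPU Time|Simulated Time)"
    r":\s*([0-9.]+)\s*seconds",
    re.MULTILINE,
)


def parse_timings(log_text: str) -> dict:
    """Return a dict of timing fields (in seconds) found in a log."""
    timings = {label: float(value) for label, value in TIMING_RE.findall(log_text)}
    # Initialization time is not printed directly; it is the difference
    # between total CPU time and simulation CPU time.
    if "Total CPU Time" in timings and "Simulation CPU Time" in timings:
        timings["Initialization CPU Time"] = (
            timings["Total CPU Time"] - timings["Simulation CPU Time"]
        )
    return timings
```

For the example log above this yields an initialization CPU time of 16.266 − 15.609 ≈ 0.657 seconds, which a timing check could track separately from the simulation time.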
The Envision code base includes this timing info at the end of our summary files, but I don't think that's the case with OpenFAST.
@bjonkman, this is great! I'm not sure all the driver codes report both.