opa
Allow sorting test results by execution time
Having several hundred tests, it would be convenient if it was possible to quickly get an idea about which tests are the slowest and may need attention. Getting this information currently requires either manually sorting the output, or taking a JSON-formatted report and pushing it into another tool to have the results sorted by execution time. We can do better!
This issue has been automatically marked as inactive because it has not had any activity in the last 30 days. Although currently inactive, the issue could still be considered and actively worked on in the future. More details about the use-case this issue attempts to address, the value provided by completing it or possible solutions to resolve it would help to prioritize the issue.
A possible way to do this is to add a new flag to the test command, `-s, --sort`, that accepts a string (`duration` for now). That leaves room for other sort keys in the future: maybe package name, or a toggle between ascending and descending order (`durationAsc`, `durationDesc`)? Slowest tests on top is probably the most common use case, though, so providing just `duration`, sorting in descending order, would suffice.
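Wiring-wise, validating the flag value up front would keep the door open for more sort keys later. A hypothetical sketch, assuming a simple set of allowed values (the names here are made up for illustration, not OPA's actual CLI code):

```go
package main

import "fmt"

// supportedSortOrders is the set of accepted --sort values; just
// "duration" to start, with room for keys like "durationAsc" later.
// This is a hypothetical helper, not part of OPA today.
var supportedSortOrders = map[string]struct{}{
	"duration": {},
}

// checkSortOrder validates a --sort value before the test run starts.
func checkSortOrder(v string) error {
	if _, ok := supportedSortOrders[v]; !ok {
		return fmt.Errorf("unsupported --sort value %q", v)
	}
	return nil
}

func main() {
	fmt.Println(checkSortOrder("duration") == nil) // true
	fmt.Println(checkSortOrder("name") == nil)     // false
}
```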
When the `--sort duration` flag is combined with the `json` and `gobench` formats, the change is straightforward: with `json` each test is a separate object within an array, and with `gobench` each test is a separate line. These can easily be reordered without any breaking changes.
For the `pretty` format, tests are only printed if they fail or `--verbose` is set, and they are organized by file. Sorting the tests would therefore only make sense when `--verbose` is also passed. The options I can think of:
1. Do nothing different if `--sort` is passed; have it only work with `json` and `gobench`.
2. If `--sort duration` and `--verbose` are passed, print the passed/failed tests in descending order by duration, but either:
   a. keep the tests organized under the file, which seems a bit useless because then you only know the slowest test per file, and that can be hard to find; or
   b. sort the tests regardless of the file (either dropping the filename or adding it next to the test name).
3. Extend the `Summary` section to list the top 5 slowest tests. This could be done regardless of the sort flag, so that each format still has a way to find the slowest tests.
I like the idea of option 3, because it seems “pretty”. Curious to hear what others think 😄
It's a power user option, so I think option 1 would be fine. #3 is also nice, but I'd probably only do that if some flag was passed to have that enabled. I'm sure there's all kinds of scripts that'd break if we change the default summary output 😅
Since the use case is really to identify the slowest tests, it might make sense to have a --limit option too. But OTOH, that'd be easy to accomplish by other means. So let's perhaps just keep it simple for a first implementation.
Yesterday I found that the --timeout flag set to some low value (like 30ms) is a pretty good way to find out which tests are the slowest (as those will throw errors while the rest pass). Adjust the value as necessary to find the slowest of them. Not as great as being able to sort, but it's something, in case anyone finds this looking for a solution in the meantime.