Report improvements that can be considered
- The downloaded CSV could contain full details of the test case, including column headers for the test attempt and the time taken
- Unable to understand what exactly the error is. If throughput or latency is the problem, say so, or provide a hyperlink to dig deeper into the errors. Also, when there is an error, why is nothing in the graph shown in red? The points are still shown in blue
- The HTML report could include a timestamp in its header
- These HTML files are so large that browsers and IDEs are unable to open them, and the graphs are not shown
- Also, the download as CSV (maybe "performance timing report" would be a better name) always shows 100 iterations
The downloaded CSV could contain full details of the test case, including column headers for the test attempt and the time taken
In the HTML report, the CSV download just contains the latency percentile distributions. There will always be 100 distributions, representing the 1st percentile up to the 100th percentile. The percentile distribution tells you the percentage of requests at or below a certain latency (i.e. if you hover over the graph you will see something like "95% of requests completed in 545ms or less").
These HTML files are so large that browsers and IDEs are unable to open them
If the HTML reports are too large, you can look at using one of the other default reporters (e.g. the console reporter or the CSV reporter).
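For example, with the JUnit4 rule the reporter is chosen when the rule is constructed. A minimal sketch (the reporter class names and packages should be checked against the junitperf version you are using):

```java
import org.junit.Rule;
import org.junit.Test;

import com.github.noconnor.junitperf.JUnitPerfRule;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.reporting.providers.ConsoleReportGenerator;

public class SomeServicePerfTest {

  // Use the console reporter instead of the default HTML reporter;
  // a CSV reporter (CsvReportGenerator) lives in the same package.
  @Rule
  public JUnitPerfRule perfRule = new JUnitPerfRule(new ConsoleReportGenerator());

  @Test
  @JUnitPerfTest(threads = 10, durationMs = 10_000)
  public void whenCallingService_thenLatencyIsAcceptable() {
    // ... call the code under test ...
  }
}
```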
Alternatively you can provide a custom class that implements the ReportGenerator interface and customise the reports.
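A skeleton for such a class might look like the sketch below. The ReportGenerator method signatures have changed between releases, so treat the shape shown here (a generateReport method taking the evaluated test contexts plus a getReportPath method) as an assumption to verify against the interface in your version:

```java
import java.util.LinkedHashSet;

import com.github.noconnor.junitperf.data.EvaluationContext;
import com.github.noconnor.junitperf.reporting.ReportGenerator;

// Assumed interface shape; check the ReportGenerator source in your junitperf version.
public class SummaryOnlyReportGenerator implements ReportGenerator {

  @Override
  public void generateReport(LinkedHashSet<EvaluationContext> testContexts) {
    // Emit only headline information per test to keep the output small;
    // pull whatever statistics you need from each EvaluationContext.
    testContexts.forEach(context -> System.out.println("Completed: " + context));
  }

  @Override
  public String getReportPath() {
    // Where the (small) report would be written, if writing to disk.
    return "target/reports/perf-summary.txt";
  }
}
```

Such a generator could then be passed to the rule in the same way as the built-in reporters, e.g. new JUnitPerfRule(new SummaryOnlyReportGenerator()).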
You can also provide a customised version of report.template in your src/main/resources dir, and this template will be used when generating HTML reports.
Unable to understand what exactly the error is
To dig deeper into errors, you can enable trace logging for the EvaluationTask class (or for the whole com.github.noconnor.junitperf package). This will tell you exactly what errors are being asserted.
How you enable this logging will depend on the logging framework you use.
This logging is not enabled by default as it may be excessively noisy for long running perf tests that expect some level of error.
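For example, if the tests use Logback, a logger entry in src/test/resources/logback-test.xml along these lines would enable it (this is a generic Logback sketch, not something shipped with the library):

```xml
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Trace logging for the whole junitperf package; narrow the name to the
       EvaluationTask class if the full package is too noisy. -->
  <logger name="com.github.noconnor.junitperf" level="TRACE"/>

  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
```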
The "Latency" label on the y-axis could state the unit as ms, even though the tooltip already gives it as ms (milliseconds)
- These HTML files are so large that browsers and IDEs are unable to open them, and the graphs are not shown
Did you change anything to reduce the HTML file size? With x.34 I now see the same tests create just a 1.69 MB HTML file. Just confirming to make sure my previous observation was not wrong.
I didn't change anything to do with report generation other than adding skipped tests.
Would you have overridden the src/main/resources/report.template by any chance?
I have not made any changes to report.template. I am unable to reproduce the 25 MB file now. Let me observe further to find out when such a huge size occurs.