
View performance test results

Open TroyonGuillaume opened this issue 4 years ago • 2 comments

What is the current situation in ARA? Currently we display the results of the integration tests and GUI tests in ARA.

Describe the solution you'd like: we would like to display the results of the performance tests as well.

TroyonGuillaume avatar Oct 07 '20 07:10 TroyonGuillaume

When running a performance test with JMeter, a JSON file named statistics.json is generated at the end of the test, containing the statistics of all requests.

Example of the file's content:

{
    "R-HTTP_REQUEST_1": {
        "transaction": "R-HTTP_REQUEST_1",
        "sampleCount": 10,
        "errorCount": 0,
        "errorPct": 0.0,
        "meanResTime": 198.0,
        "minResTime": 123.0,
        "maxResTime": 498.0,
        "pct1ResTime": 483.80000000000007,
        "pct2ResTime": 498.0,
        "pct3ResTime": 498.0,
        "throughput": 0.0765374459454288,
        "receivedKBytesPerSec": 0.11641315630859898,
        "sentKBytesPerSec": 0.08741263968083884
    },
    "R-HTTP_REQUEST_2": {
        "transaction": "R-HTTP_REQUEST_2",
        "sampleCount": 10,
        "errorCount": 0,
        "errorPct": 0.0,
        "meanResTime": 456.99999999999994,
        "minResTime": 287.0,
        "maxResTime": 844.0,
        "pct1ResTime": 843.4,
        "pct2ResTime": 844.0,
        "pct3ResTime": 844.0,
        "throughput": 0.07663246304399471,
        "receivedKBytesPerSec": 4.478411617385607,
        "sentKBytesPerSec": 0.07794209986359422
    },
    "R-HTTP_REQUEST_3": {
        "transaction": "R-HTTP_REQUEST_3",
        "sampleCount": 10,
        "errorCount": 0,
        "errorPct": 0.0,
        "meanResTime": 63.9,
        "minResTime": 45.0,
        "maxResTime": 143.0,
        "pct1ResTime": 138.60000000000002,
        "pct2ResTime": 143.0,
        "pct3ResTime": 143.0,
        "throughput": 0.07721948093064918,
        "receivedKBytesPerSec": 0.09226370597524343,
        "sentKBytesPerSec": 0.09279157352066779
    },
    "R-HTTP_REQUEST_4": {
        "transaction": "R-HTTP_REQUEST_4",
        "sampleCount": 10,
        "errorCount": 0,
        "errorPct": 0.0,
        "meanResTime": 204.20000000000002,
        "minResTime": 123.0,
        "maxResTime": 431.0,
        "pct1ResTime": 424.3,
        "pct2ResTime": 431.0,
        "pct3ResTime": 431.0,
        "throughput": 0.07776956876774117,
        "receivedKBytesPerSec": 1.6890198482521288,
        "sentKBytesPerSec": 0.08817428646420655
    },
    "Total": {
        "transaction": "Total",
        "sampleCount": 120,
        "errorCount": 0,
        "errorPct": 0.0,
        "meanResTime": 196.2166666666666,
        "minResTime": 0.0,
        "maxResTime": 1125.0,
        "pct1ResTime": 491.3000000000004,
        "pct2ResTime": 659.9999999999995,
        "pct3ResTime": 1122.48,
        "throughput": 0.829611603500961,
        "receivedKBytesPerSec": 11.706644746052433,
        "sentKBytesPerSec": 0.7339915746892414
    }
}
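
As a starting point for the ingestion, here is a minimal sketch (Python, standard library only) of reading such a statistics.json file and separating the per-transaction rows from the aggregated "Total" row. The helper name and the inline sample are illustrative, not part of ARA:

```python
import json
import os
import tempfile

def load_transaction_stats(path):
    """Parse a JMeter statistics.json file.

    Returns (per_transaction, total): the "Total" entry aggregates all
    requests, so it is split off from the individual transactions.
    """
    with open(path) as f:
        stats = json.load(f)
    total = stats.pop("Total", None)
    return stats, total

# Minimal usage with an inline sample (one transaction plus the Total row):
sample = {
    "R-HTTP_REQUEST_1": {"transaction": "R-HTTP_REQUEST_1", "sampleCount": 10,
                         "errorCount": 0, "meanResTime": 198.0},
    "Total": {"transaction": "Total", "sampleCount": 120,
              "errorCount": 0, "meanResTime": 196.2},
}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
per_transaction, total = load_transaction_stats(f.name)
os.unlink(f.name)
print(sorted(per_transaction))  # ['R-HTTP_REQUEST_1']
print(total["sampleCount"])     # 120
```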

The first step is to ingest this data into ARA. Then it would be useful to map the HTTP requests to the API cartography, so that the performance tests are related to features. That would also make it easier to use the results on the dashboard.

The goal, in continuous integration, is to track the trend of the same request across runs and, on the front end, to be able to graph the response time of each request.
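
Such a trend could be sketched as a simple diff between two consecutive statistics.json snapshots (illustrative code, not an existing ARA feature; the two inline snapshots are made up):

```python
def response_time_trend(previous, current, key="meanResTime"):
    """Compare two statistics.json snapshots, transaction by transaction.

    Returns the per-transaction delta for `key`; a positive delta means
    the request got slower since the previous run. The aggregated "Total"
    row and transactions absent from the previous run are skipped.
    """
    deltas = {}
    for name, stats in current.items():
        if name == "Total" or name not in previous:
            continue
        deltas[name] = round(stats[key] - previous[name][key], 3)
    return deltas

# Hypothetical snapshots from two consecutive CI runs:
prev = {"R-HTTP_REQUEST_1": {"meanResTime": 198.0}, "Total": {"meanResTime": 196.2}}
curr = {"R-HTTP_REQUEST_1": {"meanResTime": 251.5}, "Total": {"meanResTime": 210.0}}
print(response_time_trend(prev, curr))  # {'R-HTTP_REQUEST_1': 53.5}
```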

z28rbill avatar Oct 07 '20 09:10 z28rbill

On the other hand, it would be helpful to map the performance scenarios/requests to the "Functionality Cartography", so we could see the "performance coverage".
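
To make the idea concrete, here is a hedged sketch of such a coverage computation. The transaction-to-functionality mapping below is entirely hypothetical (in ARA it would come from the cartography itself), and the functionality IDs are invented:

```python
# Hypothetical mapping from JMeter transaction names to functionality IDs;
# in practice this association would be maintained in the cartography.
TRANSACTION_TO_FUNCTIONALITY = {
    "R-HTTP_REQUEST_1": "F-101",
    "R-HTTP_REQUEST_2": "F-101",
    "R-HTTP_REQUEST_3": "F-205",
}

def performance_coverage(stats, mapping):
    """Return the functionality IDs exercised by at least one performance
    transaction, plus the transactions that could not be mapped."""
    covered = set()
    unmapped = []
    for name in stats:
        if name == "Total":  # skip the aggregated row
            continue
        functionality = mapping.get(name)
        if functionality:
            covered.add(functionality)
        else:
            unmapped.append(name)
    return covered, unmapped

# Usage with transaction names taken from the statistics.json example above:
stats = {"R-HTTP_REQUEST_1": {}, "R-HTTP_REQUEST_3": {},
         "R-HTTP_REQUEST_4": {}, "Total": {}}
covered, unmapped = performance_coverage(stats, TRANSACTION_TO_FUNCTIONALITY)
print(sorted(covered))  # ['F-101', 'F-205']
print(unmapped)         # ['R-HTTP_REQUEST_4']
```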

z28rbill avatar Oct 08 '20 07:10 z28rbill