s3-benchmark
Is the -samples parameter ignored?
It looks like `-samples 1000000` and, e.g., `-samples 100` make no difference. Shouldn't 1,000,000 samples take longer than 100?
OK, got it. It looks like the value gets overwritten by
samples := getTargetSampleCount(threadCount, samples)
I'm not sure that makes sense... Maybe you should let users enable or disable this overwrite? Also, I guess the sample count should be relative to the object size? It would take quite a long time to fetch 100000 32MB objects. So instead of overwriting the value, maybe you could offer a time-based benchmark, where each object size runs for the same amount of time?