
No more Benchmark runs will be launched as NO measurements were obtained from the previous run!

megakid opened this issue on Feb 15 '18 · 14 comments

Using exactly these dependencies (NUnit 3.7.1, BenchmarkDotNet 0.10.8 and CodeJam.PerfTests.NUnit 0.1.4-beta) and the wiki example:

    using System.Threading;
    using CodeJam.PerfTests;
    using NUnit.Framework;

    [Category("PerfTests: NUnit examples")]
    [CompetitionAnnotateSources] // Opt-in feature: source annotations.
    public class SimplePerfTest
    {
        private const int Count = 200;

        // Perf test runner method.
        [Test]
        public void RunSimplePerfTest() => Competition.Run(this);

        // Baseline competition member.
        // All relative metrics will be compared with metrics of the baseline method.
        [CompetitionBaseline]
        public void Baseline() => Thread.SpinWait(Count);

        // Competition member #1. Should take ~3x more time to run.
        [CompetitionBenchmark]
        public void SlowerX3() => Thread.SpinWait(3 * Count);

        // Competition member #2. Should take ~5x more time to run.
        [CompetitionBenchmark]
        public void SlowerX5() => Thread.SpinWait(5 * Count);

        // Competition member #3. Should take ~7x more time to run.
        [CompetitionBenchmark]
        public void SlowerX7() => Thread.SpinWait(7 * Count);
    }

I get the following:

Test completed with errors, details below.
Errors:
    * Run #1: No result reports for benchmarks: Baseline, SlowerX3, SlowerX5, SlowerX7.
at CodeJam.PerfTests.Running.Core.NUnitCompetitionRunner.ReportExecutionErrors(String messages, CompetitionState competitionState) in C:\Users\igors\Source\Repos\CodeJam\PerfTests\src-NUnit\Running.Core\NUnitCompetitionRunner.cs:line 81
at CodeJam.PerfTests.Running.Core.CompetitionRunnerBase.RunCore(Type benchmarkType, ICompetitionConfig competitionConfig, CompetitionFeatures competitionFeatures) in C:\Users\igors\Source\Repos\CodeJam\PerfTests\src\[L5_AllTogether]\Running.Core\CompetitionRunnerBase.cs:line 233



BenchmarkDotNet=v0.10.8, OS=Windows 10 Redstone 1 (10.0.14393)
Processor=Intel Core i7-7700 CPU 3.60GHz (Kaby Lake), ProcessorCount=8
Frequency=3515621 Hz, Resolution=284.4448 ns, Timer=TSC
  [Host] : Clr 4.0.30319.42000, 64bit RyuJIT-v4.7.2558.0

Job=Competition  Force=False  Toolchain=InProcessToolchain  
InvocationCount=256  LaunchCount=1  RunStrategy=Throughput  
TargetCount=256  UnrollFactor=16  WarmupCount=128  
AdjustMetrics=True  

   Method | Mean | StdDev | Scaled | Scaled-StdDev | GcAllocations |
--------- |-----:|-------:|-------:|--------------:|--------------:|
 Baseline |   NA |     NA |   1.00 |          0.00 |             ? |
 SlowerX3 |   NA |     NA |      ? |             ? |             ? |
 SlowerX5 |   NA |     NA |      ? |             ? |             ? |
 SlowerX7 |   NA |     NA |      ? |             ? |             ? |

Benchmarks with issues:
  SimplePerfTest.Baseline: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX3: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX5: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX7: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)

============= SimplePerfTest =============
// ? Squid.Strategies.Tests.Collect2.SimplePerfTest, Squid.Strategies.Tests

---- Run 1, total runs (expected): 1 -----
ExitCode != 0
No more Benchmark runs will be launched as NO measurements were obtained from the previous run!
// Exception: System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Last[TSource](IEnumerable`1 source)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Execute(ILogger logger, Benchmark benchmark, IToolchain toolchain, BuildResult buildResult, IConfig config, IResolver resolver, GcStats& gcStats)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Run(Benchmark benchmark, ILogger logger, IConfig config, String rootArtifactsFolderPath, Func`2 toolchainProvider, IResolver resolver)
ExitCode != 0
No more Benchmark runs will be launched as NO measurements were obtained from the previous run!
// Exception: System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Last[TSource](IEnumerable`1 source)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Execute(ILogger logger, Benchmark benchmark, IToolchain toolchain, BuildResult buildResult, IConfig config, IResolver resolver, GcStats& gcStats)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Run(Benchmark benchmark, ILogger logger, IConfig config, String rootArtifactsFolderPath, Func`2 toolchainProvider, IResolver resolver)
ExitCode != 0
No more Benchmark runs will be launched as NO measurements were obtained from the previous run!
// Exception: System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Last[TSource](IEnumerable`1 source)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Execute(ILogger logger, Benchmark benchmark, IToolchain toolchain, BuildResult buildResult, IConfig config, IResolver resolver, GcStats& gcStats)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Run(Benchmark benchmark, ILogger logger, IConfig config, String rootArtifactsFolderPath, Func`2 toolchainProvider, IResolver resolver)
ExitCode != 0
No more Benchmark runs will be launched as NO measurements were obtained from the previous run!
// Exception: System.InvalidOperationException: Sequence contains no elements
   at System.Linq.Enumerable.Last[TSource](IEnumerable`1 source)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Execute(ILogger logger, Benchmark benchmark, IToolchain toolchain, BuildResult buildResult, IConfig config, IResolver resolver, GcStats& gcStats)
   at BenchmarkDotNet.Running.BenchmarkRunnerCore.Run(Benchmark benchmark, ILogger logger, IConfig config, String rootArtifactsFolderPath, Func`2 toolchainProvider, IResolver resolver)
There are no any results runs
There are no any results runs
There are no any results runs
There are no any results runs
Benchmarks with issues:
  SimplePerfTest.Baseline: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX3: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX5: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
  SimplePerfTest.SlowerX7: Competition(Force=False, Toolchain=InProcessToolchain, InvocationCount=256, LaunchCount=1, RunStrategy=Throughput, TargetCount=256, UnrollFactor=16, WarmupCount=128)
// ! #1.1  01.442s, ExecutionError@Analyser: No result reports for benchmarks: Baseline, SlowerX3, SlowerX5, SlowerX7.
// ! Hint: Ensure that benchmarks were run successfully and did not throw any exceptions.
// ! Breaking competition execution. High severity error occured.
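
For context on the log above: `Enumerable.Last()` throws `InvalidOperationException("Sequence contains no elements")` whenever it is called on an empty sequence, so the exception in the stack trace is just a consequence of the run producing no measurements at all (`ExitCode != 0`, "NO measurements were obtained"). A minimal, self-contained sketch of that LINQ behaviour; the names here are illustrative, not BDN internals:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    internal static class EmptyMeasurementsDemo
    {
        private static void Main()
        {
            // Stand-in for a per-benchmark result list that stays empty because
            // the run produced no parsable output.
            var measurements = new List<double>();

            try
            {
                // Last() throws on an empty sequence - the exact
                // "Sequence contains no elements" seen in the stack trace.
                Console.WriteLine(measurements.Last());
            }
            catch (InvalidOperationException ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }

A possible way to narrow this down would be to measure the same `Thread.SpinWait` bodies with stock BenchmarkDotNet and roughly the same settings as the `Job=Competition` line above, bypassing the competition runner. The sketch below assumes the BDN 0.10.x API (`With*` job extensions, `InProcessToolchain`); the class, config and program names are made up for this example:

    using System.Threading;

    using BenchmarkDotNet.Attributes;
    using BenchmarkDotNet.Configs;
    using BenchmarkDotNet.Engines;
    using BenchmarkDotNet.Jobs;
    using BenchmarkDotNet.Running;
    using BenchmarkDotNet.Toolchains.InProcess;

    // Hypothetical config mirroring the "Job=Competition" settings from the log.
    public class CompetitionLikeConfig : ManualConfig
    {
        public CompetitionLikeConfig()
        {
            Add(Job.Default
                .With(RunStrategy.Throughput)
                .With(InProcessToolchain.Instance)
                .WithLaunchCount(1)
                .WithWarmupCount(128)
                .WithTargetCount(256)
                .WithInvocationCount(256)
                .WithUnrollFactor(16));
        }
    }

    // Same SpinWait bodies as the perf test, but measured by plain BDN.
    [Config(typeof(CompetitionLikeConfig))]
    public class SpinWaitSanityCheck
    {
        private const int Count = 200;

        [Benchmark(Baseline = true)]
        public void Baseline() => Thread.SpinWait(Count);

        [Benchmark]
        public void SlowerX3() => Thread.SpinWait(3 * Count);
    }

    public static class Program
    {
        public static void Main() => BenchmarkRunner.Run<SpinWaitSanityCheck>();
    }

If that run also ends with `ExitCode != 0` and an empty result set, the problem likely lies in the in-process toolchain or the host environment rather than in CodeJam.PerfTests itself.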

megakid · Feb 15 '18

@megakid Oops. Will fix ASAP

ig-sinicyn · Feb 15 '18

Fixed?

NN--- · Oct 22 '18

AFAIR, I did not push the fix. The current version of perftests in CJ is kinda outdated. Going to update it next month.

ig-sinicyn · Oct 22 '18

@ig-sinicyn Any progress there? I can try updating the BenchmarkDotNet package even without solving this issue.

NN--- · Jan 05 '19

@NN--- Sorry, I'm kinda out of time. Will try to post a fix over the weekend.

ig-sinicyn · Jan 09 '19

Any updates? Can I add the project to the solution? It would be nice to build everything and publish the help automatically.

NN--- · Feb 18 '19

Yep :) I've updated the project to the latest BDN and will push the final fixes sometime after the BDN 10.14 release. I have almost no free time, so there may be a little delay.

ig-sinicyn · Feb 18 '19

I wonder whether or not we should have a separate solution and docs project for perftests. Right now the perftest doc is a separate file and needs a separate build.

NN--- · Feb 18 '19

@ig-sinicyn Check whether you can require a lower version for each dependency, e.g. NUnit 3.0.0 instead of 3.12, and so on.

NN--- · Feb 18 '19

Yep, perftests will move into a separate solution. CJ is stable enough and there's no benefit in keeping both projects in a single solution.

> Check whether you can require a lower version for each dependency, e.g. NUnit 3.0.0 instead of 3.12, and so on.

Will do.

ig-sinicyn · Feb 18 '19

I guess we will see no progress :( What is still missing?

NN--- · May 10 '19

TBH, I have almost no free time for this. This summer, maybe? :)

ig-sinicyn · May 10 '19

I guess there will be no progress here. Is it still relevant, or is there a solution in a different library?

NN--- · May 01 '20

Let's close it. Perftests are currently dead, as I have no time to revive them :)

ig-sinicyn · May 01 '20