
threaded test_suite_runner.c3

ManuLinares opened this issue 1 week ago

Changes:

  • Added a simple thread pool
  • Fixed the status-line updates
  • Implemented #skip for tests
  • Added ANSI color to the final status line

It works as expected, reducing the total runner time roughly by a factor of the number of allocated threads.
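As a rough illustration of the thread-pool approach (a Python sketch, not the actual C3 runner; the test names and the `run_one` callback are made-up stand-ins), workers pull test paths from a shared queue until it is empty:

```python
import queue
import threading

def run_tests_threaded(tests, run_one, num_threads=4):
    """Run each test via run_one(test) across num_threads workers."""
    work = queue.Queue()
    for t in tests:
        work.put(t)
    results = {}
    lock = threading.Lock()

    def worker():
        while True:
            try:
                t = work.get_nowait()
            except queue.Empty:
                return  # no work left, worker exits
            ok = run_one(t)
            with lock:  # protect the shared results dict
                results[t] = ok

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Toy usage: "tests" that pass unless the name contains "fail".
outcome = run_tests_threaded(
    ["a.c3t", "b.c3t", "fail_c.c3t"], lambda name: "fail" not in name)
print(sorted(k for k, ok in outcome.items() if ok))
# → ['a.c3t', 'b.c3t']
```

Because the workers complete in whatever order the scheduler allows, anything they print directly will interleave, which is the ordering problem discussed in the rest of the thread.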

ManuLinares avatar Dec 13 '25 04:12 ManuLinares

How do you use this now? I tried to run the standard library unit tests with a compiler built from this branch: time ../build/release/c3c compile-test unit/ But the time is about the same for this PR as for the master branch, so no noticeable difference.

BWindey avatar Dec 13 '25 21:12 BWindey

The problem is that this is currently printing things out of order. Note that having [current/failed] rather than [passed/total] is essential for being able to narrow down issues with CI, please don't change that. Also note that when something fails it will print out a lot of important information, so that too cannot be out of order.

lerno avatar Dec 13 '25 22:12 lerno

> The problem is that this is currently printing things out of order. Note that having [current/failed] rather than [passed/total] is essential for being able to narrow down issues with CI, please don't change that. Also note that when something fails it will print out a lot of important information, so that too cannot be out of order.

Hi, when something fails it prints the same info as the previous implementation did. But for the purposes of --no-terminal I could buffer the output and add [passed/total].

> How do you use this now? I tried to run the standard library unit tests with a compiler built from this branch: time ../build/release/c3c compile-test unit/ But the time is about the same for this PR as for the master branch, so no noticeable difference.

The unit directory contains very few tests, so it's understandable that the time is similar. When running the whole test suite it is significantly faster: roughly time divided by the number of threads. It cuts the CI runner time in half or less if tweaked.

ManuLinares avatar Dec 13 '25 22:12 ManuLinares

Latest output looks like this:

- 21/1298 /home/runner/work/c3c/c3c/test/test_suite/union/union_codegen_empty.c3t: Passed.
- 22/1298 /home/runner/work/c3c/c3c/test/test_suite/union/designated_union_zeroing.c3t: Passed.
- 23/1298 /home/runner/work/c3c/c3c/test/test_suite/union/inferred_size_vector.c3: Passed.
Passed.
- 24/1298 /home/runner/work/c3c/c3c/test/test_suite/slices/slice_conv_byte.c3t: - 25/1298 /home/runner/work/c3c/c3c/test/test_suite/slices/slice_len_error.c3: Passed.
- 26/1298 /home/runner/work/c3c/c3c/test/test_suite/slices/slice_optional_index.c3t: Passed.

lerno avatar Dec 13 '25 23:12 lerno

I've corrected the output when using --no-terminal. It's now nicely ordered, and each line preserves the previous behavior of printing "[current/failed] filename: status" messages, one per line.

I don't know why OpenBSD fails, and I don't know why "msvc-debug" takes so long, so I disabled it.

The whole CI workflow took 17m 38s. Nice ;D On my PC, the tests run in under 30 seconds.

I'm happy with that. You mentioned in the past that everything would be faster if LLVM weren't initialized and deinitialized for each test, but honestly that kind of compiler refactoring seemed too hard.

Please feel free to point out any changes you'd like to see made, and/or modify it to your liking, of course.

ManuLinares avatar Dec 15 '25 01:12 ManuLinares

With --no-terminal I get this:

- 1291/0 /Users/lerno/Projects/c3c/test/test_suite/precedence/required_parens.c3: - 1292/1 /Users/lerno/Projects/c3c/test/test_suite/lexing/invalid_hex_in_hexarray.c3: - 1293/2 /Users/lerno/Projects/c3c/test/test_suite/lexing/expected_directive.c3: - 1294/3 /Users/lerno/Projects/c3c/test/test_suite/lexing/no_builtin.c3: - 1295/4 /Users/lerno/Projects/c3c/test/test_suite/lexing/too_long_ident.c3: - 1296/5 /Users/lerno/Projects/c3c/test/test_suite/lexing/invalid_hex_in_hexarray2.c3: - 1297/6 /Users/lerno/Projects/c3c/test/test_suite/interfaces/interface_multi.c3: - 1298/7 /Users/lerno/Projects/c3c/test/test_suite/interfaces/interface_test.c3: 
Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).

lerno avatar Dec 19 '25 15:12 lerno

Oddly enough I get this even with --no-terminal:

Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).

It's just placed as stderr output.

lerno avatar Dec 19 '25 15:12 lerno

This is on a terminal that doesn't recognize ANSI; this shows where that output suddenly appears:

[Testing: 1/1299 | Passed: 0 | Failed: 0] 
[Testing: 2/1299 | Passed: 0 | Failed: 0] 
[Testing: 3/1299 | Passed: 0 | Failed: 0] 
[Testing: 4/1299 | Passed: 0 | Failed: 0] 
[Testing: 5/1299 | Passed: 0 | Failed: 0] 
[Testing: 6/1299 | Passed: 0 | Failed: 0] 
[Testing: 7/1299 | Passed: 0 | Failed: 0] 
[Testing: 8/1299 | Passed: 0 | Failed: 0] Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).

[Testing: 8/1299 | Passed: 1 | Failed: 0] 

lerno avatar Dec 19 '25 15:12 lerno

For this [Testing: 1/1299 | Passed: 0 | Failed: 0] ANSI-based output, consider instead something like:

Test progress: [XXXXXXXXXXXXX------] 25% complete (2 failed) - Testing: foo.c3

lerno avatar Dec 19 '25 15:12 lerno

> This is on a terminal that doesn't recognize ANSI; this shows where that output suddenly appears:
>
> [Testing: 7/1299 | Passed: 0 | Failed: 0]
> [Testing: 8/1299 | Passed: 0 | Failed: 0] Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).
>
> [Testing: 8/1299 | Passed: 1 | Failed: 0]

I can't reproduce it, but I'll remove the ANSI colors.

> With --no-terminal I get this:
>
> - 1291/0 /Users/lerno/Projects/c3c/test/test_suite/precedence/required_parens.c3: - 1292/1 /Users/lerno/Projects/c3c/test/test_suite/lexing/invalid_hex_in_hexarray.c3: - 1293/2 /Users/lerno/Projects/c3c/test/test_suite/lexing/expected_directive.c3: - 1294/3 /Users/lerno/Projects/c3c/test/test_suite/lexing/no_builtin.c3: - 1295/4 /Users/lerno/Projects/c3c/test/test_suite/lexing/too_long_ident.c3: - 1296/5 /Users/lerno/Projects/c3c/test/test_suite/lexing/invalid_hex_in_hexarray2.c3: - 1297/6 /Users/lerno/Projects/c3c/test/test_suite/interfaces/interface_multi.c3: - 1298/7 /Users/lerno/Projects/c3c/test/test_suite/interfaces/interface_test.c3:

I will fix this. The newline is appended later, so when no tests have run, no newline is printed. It's definitely a bug.

Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).

> Oddly enough I get this even with --no-terminal:
>
> Found 8 tests: 0.0% (0 / 8) passed (0 skipped, 0 failed).
>
> It's just placed as stderr output.

I will check for this (are you using the latest pull?). It's weird, because I don't output anything to stderr; with --no-terminal, all test output is buffered. Maybe there was some ANSI weirdness in the failed-test output.

Will try to fix and test in CLion, right?

ManuLinares avatar Dec 19 '25 16:12 ManuLinares

I've fixed CLion, so it's OK, and tweaked the output. But for --no-terminal I want output for each test as soon as it is done; if that means you have to print it like this, then so be it:

- 2/0 Running test_suite/macro_methods/macro_method_first_param.c3
- 3/0 Running test_suite/macro_methods/macro.c3
ERROR: Failure in test macro_method_first_param.c3
...
- 4/1 Running test_suite/macro_methods/testme.c3

In other words, we print when it starts, then we print if it fails.

Alternatively:

- 1/0 Running test_suite/macro_methods/macro_method_first_param.c3
- 1/0 Running test_suite/macro_methods/macro.c3
- 2/0 Test macro_method_first_param.c3 passed.
- 3/1 Test macro.c3 failed:
ERROR: Failure in test macro.c3
...
- 3/1 Running test_suite/macro_methods/testme.c3
- 3/1 Running test_suite/macro_methods/testme2.c3
- 4/1 Test macro_method_first_param.c3 passed.

So completed / failed. We can also imagine started / completed / failed:

- 2/1/0 Running test_suite/macro_methods/macro_method_first_param.c3
- 3/1/0 Running test_suite/macro_methods/macro.c3
- 3/2/0 Test macro_method_first_param.c3 passed.
- 3/3/1 Test macro.c3 failed:
ERROR: Failure in test macro.c3
...
- 4/3/1 Running test_suite/macro_methods/testme.c3
- 5/3/1 Running test_suite/macro_methods/testme2.c3
- 5/4/1 Test macro_method_first_param.c3 passed.

lerno avatar Dec 19 '25 23:12 lerno

I want to add that the reason is that sometimes even the test runner crashes, and it's important to narrow down where. Also, seeing the progress is useful when one wants to abort the CI.

lerno avatar Dec 19 '25 23:12 lerno

> I want to add that the reason is that sometimes even the test runner crashes, and it's important to narrow down where. Also, seeing the progress is useful when one wants to abort the CI.

Makes sense, although the columns [test_number/failed_tests] won't be accurate:

test_number is fine, never mind the order, but failed_tests will almost always be wrong (99% of the time), due to the nature of threading.

We could print something like [test_number/status] filename, where "status" is one of "started/passed/failed", so each test would get at least two lines (more if it fails), plus the final status line once everything has finished.

ManuLinares avatar Dec 20 '25 01:12 ManuLinares

It's probably sufficient to just use the first one I listed:

  1. Increment the counter (and print) when a new test is started.
  2. Increment the error counter when a test has failed. So the first number is never a "test number", just the count of tests that have been launched, and the error counter is updated as soon as a test fails.
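That two-counter scheme might be sketched like this (Python; `on_start`, `on_fail`, and the in-memory `log` list are hypothetical stand-ins for the runner's printing):

```python
import threading

lock = threading.Lock()
started = 0  # incremented and printed when a test is launched
failed = 0   # incremented only when a failure is reported
log = []     # stands in for stdout so the sketch is easy to test

def on_start(name):
    global started
    with lock:
        started += 1
        log.append(f"- {started}/{failed} - {name}")

def on_fail(name, error_text):
    global failed
    with lock:
        failed += 1
        log.append(f"- {started}/{failed} - {name} failed:")
        log.append(error_text)
```

Note that the started counter is printed on every line, so the failed column can momentarily lag when completions arrive out of order; that is inherent to the scheme, not a bug.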

lerno avatar Dec 20 '25 02:12 lerno

If we have three tests all ok:

- 1/0 - test1.c3
- 2/0 - test2.c3
- 3/0 - test3.c3
Found 3 tests: 100.0% (3 / 3) passed (0 skipped, 0 failed).

If the first fails, before launching the third:

- 1/0 - test1.c3
- 2/0 - test2.c3
- 2/1 - test1.c3 failed:
[ Error text here ]
- 3/0 - test3.c3
Found 3 tests: 66.7% (2 / 3) passed (0 skipped, 1 failed).

Both test 1 and test 2 fail, out of order:

- 1/0 - test1.c3
- 2/0 - test2.c3
- 2/1 - test2.c3 failed:
[ Error text here ]
- 2/2 - test1.c3 failed:
[ Error text here ]
- 3/0 - test3.c3
Found 3 tests: 33.3% (1 / 3) passed (0 skipped, 2 failed).

lerno avatar Dec 20 '25 02:12 lerno

Do you have time to work on this today, you think?

lerno avatar Dec 20 '25 15:12 lerno

> Do you have time to work on this today, you think?

I may have time later, but have a crack at it; I'll be on Discord.

ManuLinares avatar Dec 20 '25 19:12 ManuLinares

MSVC debug takes so long because non-optimized LLVM with full asserts and debug info is EXTREMELY slow. About two orders of magnitude slower than normal.

lerno avatar Dec 21 '25 01:12 lerno

> MSVC debug takes so long because non-optimized LLVM with full asserts and debug info is EXTREMELY slow. About two orders of magnitude slower than normal.

Are you OK with disabling test_suite_runner in the msvc-debug CI workflow?

If not, could RelWithDebInfo be used instead of Debug?

ManuLinares avatar Dec 21 '25 06:12 ManuLinares

Thank you! I think this is enough now to work reliably, so I merged it.

lerno avatar Dec 21 '25 13:12 lerno