zig test no output
Zig Version
0.9.0-dev.1737+c42763f8c
Steps to Reproduce
mkdir repro
cd repro
zig init-exe
mkdir subdir
mkdir other_subdir
touch subdir/solution.zig
touch other_subdir/solution.zig
edit src/Main.zig to:
const std = @import("std");
const solution = @import("./subdir/solution.zig");

pub fn main() anyerror!void {
    std.log.info("All your codebase are belong to us.", .{});
    solution.solve();
}

test "basic test" {
    try std.testing.expect(true);
}
edit src/subdir/solution.zig to:
const std = @import("std");
const shared = @import("./../other_subdir/shared.zig");

pub fn solve() void {
    std.log.info("in solve", .{});
    shared.some_func();
}

test "solution test" {
    try std.testing.expect(true);
}
edit src/other_subdir/shared.zig to:
const std = @import("std");

pub fn some_func() void {
    std.log.info("in some_func", .{});
}
then run
zig test src/Main.zig
>> All 1 tests passed.
zig test ./src/subdir/solution.zig
>> <nothing outputs>
then go into ./src/subdir/solution.zig and change it to:
const std = @import("std");
// const shared = @import("./../other_subdir/shared.zig");

pub fn solve() void {
    std.log.info("in solve", .{});
    // shared.some_func();
}

test "solution test" {
    try std.testing.expect(true);
}
and rerun
zig test src/Main.zig
>> All 1 tests passed.
zig test ./src/subdir/solution.zig
>> All 1 tests passed.
Expected Behavior
I expect running zig test somefile/in/subdirs/file.zig to detect and run the tests in <..>/file.zig, regardless of whether @import is used.
Actual Behavior
zig test somefile/in/subdirs/file.zig runs silently and apparently does nothing once the tests compile.
The tests are detected: if I remove the test "solution test" { ... } block, zig test <..>/solution.zig correctly reports that no tests were found. Any compiler errors are also properly reported.
This test case can be reduced to
mkdir dir1
mkdir dir2
# insert file1.zig
# insert file2.zig
# pwd = xxx/src
zig test dir1/file1.zig # success
cd dir1; # pwd = xxx/src/dir1
zig test file1.zig # failure
// dir1/file1.zig
const std = @import("std");
//const shared = @import("./../dir/file2.zig");
const shared = @import("../dir2/file2.zig");
test "solution test" {
try std.testing.expect(true);
}
// dir2/file2.zig
const std = @import("std");
Unfortunately root source files are not yet documented, which would clarify how this is not a bug.
This test case can be reduced to ... Unfortunately root source files are not yet documented, which would clarify how this is not a bug.
Sorry, I'm not sure I understand. Does this mean you can't run tests if they're not toplevel files? For my case, if I just cd next to the file in the subdir I'm still not getting the test to run.
$ cd <..>\src\subdir
$ zig test .\solution.zig
<nothing prints after it compiles>
Check out the build.zig file. This is where the build step that you are running with zig build test is defined. It specifies a root source file. That file then must @import anything that it wants to be tested. It has nothing to do with subdirectories.
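As an illustration, here is a minimal sketch of such a root source file, assuming the init-exe layout from the reproduction above (refAllDecls is one common pattern for this, not the only way):
const std = @import("std");

// Root source file sketch: every file whose tests should run must be
// reachable from here via @import.
const solution = @import("subdir/solution.zig");
const shared = @import("other_subdir/shared.zig");

test "reference all declarations" {
    // Forces analysis of the declarations above, which pulls the
    // test blocks of the imported files into the test build.
    std.testing.refAllDecls(@This());
}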
Thanks, that makes sense.
I expected some sort of output like 'no tests found', as it normally prints when it can't find any.
Without any output at all, it's unclear to a newbie what's happening. I expected to see either 'tests found and tests passed/failed' or 'no tests found'; I didn't expect a third valid outcome where nothing is output.
Thanks again for clarifying that it was an incorrectly set up build.zig!
zig test will print "No tests to run" if there are no tests found. If it finds tests, it will print "X tests passed" or similar. No output is unexpected. Apologies for missing this in your report.
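For example, running it against the file from the reproduction that has no test blocks should look roughly like this (hypothetical transcript; the "No tests to run" string is taken from the message above):
$ zig test src/other_subdir/shared.zig
No tests to run.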
This issue can be reproduced as of 0.11.0 using nothing more than init-exe.
Steps to Reproduce
zig init-exe
zig build test
After the last command I get no output in version 0.11.0-dev.2680+a1aa55ebe. Interestingly, the correct behaviour occurs in version 0.10.1 (All 1 tests passed.), at least using this default program.
Furthermore, zig test src/main.zig works in both versions I tested.
I got zig version 0.11.0 from the prebuilt binaries on the site.
@LincePotiguara this is expected. When everything works successfully, it is traditional for command line applications to print nothing. If you would like to see a summary, you can pass -fsummary, like this: zig build test -fsummary.
I would have thought you'd get a green 'all tests passed' instead of absolutely nothing by default, but I understand that some CLIs also print nothing when they succeed.
@LincePotiguara this is expected. When everything works successfully, it is traditional for command line applications to print nothing. If you would like to see a summary, you can pass -fsummary, like this: zig build test -fsummary.
It seems a bit inconsistent with doing this:
zig init-exe
zig test src/main.zig
which outputs:
All 1 tests passed.
Is this intended?
Hey @lawrence-laz, you may set std.testing.log_level, I guess.
const std = @import("std");

test "nothing" {
    std.testing.log_level = .info;
    std.log.info("nothing", .{});
    std.log.debug("nothing", .{});
    std.log.warn("nothing", .{});
    try std.testing.expect(true);
}
which creates this output:
pseudoc $ zig test main.zig
Test [1/1] test.nothing... [default] (info): nothing
[default] (warn): nothing
All 1 tests passed.
@LincePotiguara this is expected. When everything works successfully, it is traditional for command line applications to print nothing. If you would like to see a summary, you can pass -fsummary, like this: zig build test -fsummary.
The new command as of 0.11 is zig build test --summary all.
Coming from other languages and testing frameworks, it is kinda odd for me that zig build test has no output by default, and even with zig build test --summary all it gives something along the lines of:
# Not cached
❯ zig build test --summary all
Build Summary: 3/3 steps succeeded; 1/1 tests passed
test success
└─ run test 1 passed 2ms MaxRSS:3M
└─ zig test Debug native success 977ms MaxRSS:156M
# cached
❯ zig build test --summary all
Build Summary: 3/3 steps succeeded
test cached
└─ run test cached
└─ zig test Debug native cached 18ms MaxRSS:17M
I would expect it to at least tell me which file the tests ran from and the name of each test; what is the reason to have test "basic add functionality" if there is no indication of that name being used anywhere when the tests are run?
IIRC -fsummary worked before, and I think --summary all worked too at least once, but it does not work anymore (0.12.0-dev.1849+bb0f7d55e) and I had to use std.debug.print() instead.
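As a sketch of that workaround (the test name "basic add functionality" is borrowed from the comment above; the assertion is a placeholder): std.debug.print writes directly to stderr, so its output shows up regardless of the runner's summary settings.
const std = @import("std");

// Workaround sketch: std.debug.print goes straight to stderr, so the
// message appears even when the test runner prints no summary.
test "basic add functionality" {
    std.debug.print("running: basic add functionality\n", .{});
    try std.testing.expectEqual(@as(i32, 4), 2 + 2);
}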
Coming from other languages and testing frameworks, it is kinda odd for me that zig build test has no output by default
Agreed. I've been learning a little about zig the last couple of days, and tried a few things with a friend of mine. We were both confused that zig test outputs nothing by default when the tests pass. I usually use @mesonbuild these days; no extra flags are required to display a summary. As I recall, summaries are also printed without extra flags by CMake, cargo, and the test driver used with automake.
I concur, coming from the Python world. Pytest prints a detailed test report on completion, including a simple summary of what happened: '777 passed, 3 skipped, 13 warnings in 2.35s'. I feel the summary is required to be certain that tests are actually running and that the number you expect is correct.
Details of the platform the tests were run on and the versions involved are also useful if the test report is required for later validation or review.
The percentage complete is less useful for zig test due to it being MUCH faster.