Emit Debug Info for Local Consts that have been Optimized Down
### Zig Version

0.13.0

### Steps to Reproduce and Observed Behavior
Hello! I noticed that in an example program such as this:
```zig
const std = @import("std");
const print = std.debug.print;

fn varOne() i32 {
    return 0x123456;
}

pub fn main() !void {
    const var_one = 0xaabbaabb;
    // const var_one = varOne();
    print("var_one: {d}\n", .{var_one});
}
```
The compiler makes the correct choice to push the use of that constant down into the only place it's used, which is deep within the `print` call (specifically, it's used in `formatInt`):
```
0000000001083ee0 <fmt.formatIntValue__anon_8666>:
) !void {
 1083ee0: 55                    push %rbp
 1083ee1: 48 89 e5              mov  %rsp,%rbp
 1083ee4: 48 83 ec 10           sub  $0x10,%rsp
 1083ee8: 49 89 d1              mov  %rdx,%r9
 1083eeb: 49 89 f0              mov  %rsi,%r8
 1083eee: 48 89 7d f0           mov  %rdi,-0x10(%rbp)
break :blk @as(Int, value);
 1083ef2: c7 45 fc bb aa bb aa  movl $0xaabbaabb,-0x4(%rbp)
return formatInt(int_value, base, case, options, writer);
```
If I swap the comments around and call `varOne` instead, the variable is stack-allocated and everything looks as expected in both the output binary and the debug info.
The problem is that, as the author of a debugger, I'm not able to map that source-level constant to anything that could be rendered to the user to indicate its status. For instance, in the locals window here I have a program that just allocates one of every primitive type plus a couple of array/slice types, but the constant values `ap` and `aq` are missing, which confused me:
### Expected Behavior
Even though in an example this easy to understand anyone can see that `var_one` has simply been optimized out, it would still be nice to have a `DW_TAG_variable` on that function's DIE, perhaps with a `DW_OP_consts` value (or something similar for, e.g., a variable that is a const instance of a struct), so I can display it in the locals and watch windows. I'm not sure whether similar notions exist in other debug formats; I'm really only familiar with DWARF. I'm also sure this gets more and more complicated as the type gets more complex.
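To make the request concrete, here is a sketch of the kind of DIE I have in mind, in llvm-dwarfdump-style notation. Whether the value is best encoded via a `DW_OP_consts` location expression or a `DW_AT_const_value` attribute is an open question; the offsets and exact layout here are illustrative only, not what any compiler currently emits:

```
DW_TAG_variable
  DW_AT_name          ("var_one")
  DW_AT_type          (0x... "i32")
  DW_AT_const_value   (0xaabbaabb)
```

A debugger could then render `var_one = 2864421563` in the locals window even though the value never lives in a register or stack slot.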
Thanks for considering!
The cause for this is that `var_one` has type `comptime_int`; if you provide an explicit type (with `const var_one: i32 = ...` or `... = @as(i32, 0xaabbaabb)`) you should see the variable getting debug info again. Maybe we do want to generate debug info for values of otherwise comptime-only types, but in the general case that would often not make much sense: even if we drop type values and only cover `comptime_int` and `comptime_float`, could debuggers even deal with big integers sensibly?
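As a concrete illustration of the point above, here is a minimal sketch contrasting the `comptime_int` case with the explicitly typed variants (the names `folded`, `typed`, and `coerced` are mine, and whether debug info is emitted for each may vary by compiler version and optimization level):

```zig
const std = @import("std");

pub fn main() !void {
    // comptime_int: constant-folded at compile time, no debug info today
    const folded = 0xaabbaabb;
    // Explicitly typed: materialized as a runtime u32 with debug info
    const typed: u32 = 0xaabbaabb;
    // Same effect via an explicit coercion
    const coerced = @as(u32, 0xaabbaabb);
    std.debug.print("{d} {d} {d}\n", .{ folded, typed, coerced });
}
```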
Is this the LLVM backend (default)?
Like tau-dev said, the issue is the type being `comptime_int`: https://github.com/ziglang/zig/blob/f2bf6c1b11702179329a4693cea429d550f519e1/src/Sema.zig#L6716

Using `const var_one: i32 = 0x123456;` to match calling `varOne` produces debug info as expected.

`comptime_int` could be made to work by having Sema assign the value an appropriately sized integer type.
This was with the LLVM backend, yes.
> Is this the LLVM backend (default)?
I just wanted to note that this occurs with both backends; I don't believe it is anything LLVM-specific.