`\0` in a byte string literal does not encode NUL
Currently:
assert: b"foo\0_bar\0" == b"foo\x5c\x30_bar\x5c\x30"
Expected:
assert: b"foo\0_bar\0" == b"foo\x00_bar\x00"
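To make the difference concrete, here is the same comparison sketched in Python (not Savi): `\x5c` is the backslash character and `\x30` is the digit `0`, so the current behavior encodes two literal bytes per `\0` instead of a single NUL byte.

```python
# What Savi currently produces: backslash + '0', two bytes per \0.
current = b"foo\x5c\x30_bar\x5c\x30"
# What the reporter expected: a single NUL byte per \0.
expected = b"foo\x00_bar\x00"

assert current == b"foo\\0_bar\\0"  # two literal characters each time
assert len(current) == 11           # 3 + 2 + 1 + 3 + 2
assert len(expected) == 9           # 3 + 1 + 1 + 3 + 1
```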
Maybe there is good reason for this.
Might also apply to Strings.
> Maybe there is good reason for this.
No, there's no reason apart from me just not knowing that \0 was a common escape sequence that people expect to be able to use.
I've never used the escape sequence \0 in another language. Just to save me searching around for what you already know: which language(s) are you used to using this escape sequence in?
I have no problem with us adding this escape sequence if it's common enough in other languages that people expect it.
\0 is used in quite a few languages: C, C++, Ruby, Python, Zig, ...
Roughly speaking, if \n is an escape sequence in a language, then \0 is too.
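As a quick check of that claim, Python is one such language: it treats \0 as an octal escape denoting the NUL byte, in both string and bytes literals, just like C.

```python
# In Python (as in C, C++, Ruby, and Zig), \0 is an octal escape for NUL.
s = "foo\0bar"
assert s == "foo\x00bar"   # \0 decodes to a single NUL character
assert len(s) == 7          # not 8: \0 is one character, not two

b = b"\0"
assert b == b"\x00" and len(b) == 1
```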
I don't need \0 as an escape sequence, since its use is rather rare! But I'd make everything that follows \ special, and if it's unknown, I'd rather emit a compiler error than encode it as-is. Otherwise some people will use \0 (like I did :) and, if they don't have tests, will wonder why they get the two characters \ and 0 instead of NUL.
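The behavior being proposed can be sketched as a small escape decoder. This is a hypothetical illustration in Python, not Savi's actual lexer: known escapes are decoded, and any unknown escape raises an error rather than passing through as two literal characters. The escape table here is an assumption for illustration, not Savi's real set.

```python
# Hypothetical escape table for illustration; not Savi's actual set.
ESCAPES = {"n": "\n", "t": "\t", "r": "\r", "\\": "\\", '"': '"', "0": "\0"}

def decode_escapes(literal: str) -> str:
    """Decode backslash escapes, rejecting unknown ones outright."""
    out = []
    i = 0
    while i < len(literal):
        ch = literal[i]
        if ch == "\\":
            if i + 1 >= len(literal):
                raise SyntaxError("dangling backslash at end of literal")
            nxt = literal[i + 1]
            if nxt not in ESCAPES:
                # The proposed behavior: error out instead of emitting \ + char.
                raise SyntaxError(f"unknown escape sequence \\{nxt}")
            out.append(ESCAPES[nxt])
            i += 2
        else:
            out.append(ch)
            i += 1
    return "".join(out)

assert decode_escapes(r"foo\0bar") == "foo\x00bar"  # known escape decoded

try:
    decode_escapes(r"foo\qbar")
except SyntaxError as e:
    print(e)  # unknown escape sequence \q
```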
Given the large number of languages where it's present, I'm definitely okay with us adding \0 if it is desired.
I also agree with you that we should issue a compilation error when an unknown escape sequence is encountered, in a String or Bytes literal. I'll prepare a quick PR to add that as a compilation error.
PR #353 adds a nice compilation error on invalid escape characters.
I'll leave this ticket open in case you want to do a PR to add \0 as a valid escape sequence.