parity-common
[uint, primitive-types]: Parsing from string is not what is expected
I was using U256
as an argument with clap
and found out that my input was parsed as a hexadecimal number.
I expected the following to pass, but the first assert fails because num
is 39.
use primitive_types::U256;

fn main() {
    let num: U256 = "27".parse().unwrap();
    assert_eq!(num, 27.into());
    let num: U256 = "0x27".parse().unwrap();
    assert_eq!(num, 0x27.into());
}
We have the from_dec_str
and from_str_radix
methods for parsing non-hexadecimal strings. By default, the string is assumed to be hexadecimal.
Would you expect let num: U256 = "aa".parse().unwrap();
to fail to parse and only parse hexadecimal if it's prefixed with 0x?
Yes, exactly.
That might be a better default; however, I don't know how many crates depend on the current behavior, and such a change won't be caught at compile time. So I'm very hesitant to change it now. Maybe we could improve the documentation instead to make this clear.
When deriving Deserialize
for API structs, we ran into issues a few times because of this behavior. I understand that completely changing this behavior could be a serious breaking change for some users, but would you maybe accept a PR that introduces an improved parsing behavior via a feature flag people would have to opt into?
I would suggest the feature flag allow number types to be parsed from decimal strings and 0x-prefixed hex strings (e.g. "1234" and "0xabcd1234").
I can't think of a precedent off the top of my head where the behavior of an existing function was changed via a feature flag without changing its API. But it sounds reasonable to me to add a feature, disabled by default, that does that.