reth
debug_traceCall
Describe the bug
I believe `debug_traceCall` in Reth doesn't return the required storage keys. I'm not sure whether the problem is in revm or in the tracers crate.
Here's a request that returns different data from Geth and Reth.
Inside the call, one of the contracts reverts; however, the calling contract handles this revert. Geth returns the storage cells required for this call, but Reth doesn't.
Steps to reproduce
Try the following request against both Reth and Geth.
I believe you can set a block number instead of `"latest"`. Serialized request:

```json
{
  "method": "debug_traceCall",
  "params": [
    {
      "to": "0x9980ce3b5570e41324904f46a06ce7b466925e23",
      "gas": "0xf4240",
      "input": "0x2764cd0b000000000000000000000000352b186090068eb35d532428676ce510e17ab58100000000000000000000000000000000000000000000000009d09e34743bd41f000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
    },
    "latest",
    { "tracer": "prestateTracer", "tracerConfig": { "diffMode": false } }
  ],
  "id": 8,
  "jsonrpc": "2.0"
}
```
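To make the comparison concrete, here is a minimal Python sketch of how one might build the payload above and diff the `prestateTracer` results from the two clients. The helper names (`build_trace_call`, `diff_storage_keys`) are hypothetical, not part of either client; it assumes both results are plain dicts keyed by account address, as the prestate tracer returns them.

```python
# Build the debug_traceCall JSON-RPC payload from the report.
def build_trace_call(to, gas, input_data, block="latest"):
    return {
        "method": "debug_traceCall",
        "params": [
            {"to": to, "gas": gas, "input": input_data},
            block,
            {"tracer": "prestateTracer", "tracerConfig": {"diffMode": False}},
        ],
        "id": 8,
        "jsonrpc": "2.0",
    }

# Compare two prestateTracer results: for each account, list the storage
# slots Geth reported that are absent from Reth's answer.
def diff_storage_keys(geth_result, reth_result):
    missing = {}
    for account, state in geth_result.items():
        geth_slots = set(state.get("storage", {}))
        reth_slots = set(reth_result.get(account, {}).get("storage", {}))
        absent = geth_slots - reth_slots
        if absent:
            missing[account] = sorted(absent)
    return missing
```

Sending the payload to each node's RPC endpoint and passing the two `result` objects to `diff_storage_keys` would show exactly which slots Reth omits for the reverted-but-handled inner call.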
Node logs
No response
Platform(s)
Linux (x86)
What version/commit are you on?
reth Version: 0.2.0-beta.6-dev
Commit SHA: 2334317dc
Build Timestamp: 2024-05-02T07:47:43.588097501Z
Build Features: jemalloc
Build Profile: maxperf
What database version are you on?
Current database version: 2
Local database is uninitialized
Which chain / network are you on?
1 - Ethereum mainnet
What type of node are you running?
Archive (default)
What prune config do you use, if any?
No response
If you've built Reth from source, provide the full command you used
RUSTFLAGS="-C target-cpu=native" cargo build --profile maxperf --features jemalloc
Code of Conduct
- [X] I agree to follow the Code of Conduct
thanks, checking — this must be in the prestate tracer logic
ref https://github.com/bluealloy/revm/pull/1437
is this resolved @mattsse / @dexloom ?
I believe this was fixed via revm https://github.com/bluealloy/revm/pull/1437
optimistically closing
please reopen if you're still encountering this @dexloom
Looks like it's fixed!