nimbus-eth1
`AssertionError` when syncing goerli
2eb46ca2212d5d6b9c8f5c78b282a2ec9bd1260b
nimbus --goerli --log-level=DEBUG
...
DBG 2021-04-08 15:39:59.194+02:00 Handshake message not delivered topics="rlpx" tid=811532 file=rlpx.nim:131 peer=Node[206.189.192.240:30311]
DBG 2021-04-08 15:39:59.194+02:00 Handshake message not delivered topics="rlpx" tid=811532 file=rlpx.nim:131 peer=Node[147.135.70.51:30304]
DBG 2021-04-08 15:39:59.643+02:00 executeOpcodes error topics="vm opcode" tid=811532 file=interpreter_dispatch.nim:393 msg="Opcode Dispatch Error msg=Out of gas: Needed 20000 - Remaining 2340 - Reason: SSTORE: be452a176d93b80202a0f1ae57829c2a79b579c7[19306861759201220936461974706836567259894967920184488036630826450862673086425] -> 100000000000000000000000 (0), depth=1"
DBG 2021-04-08 15:39:59.643+02:00 executeOpcodes error topics="vm opcode" tid=811532 file=interpreter_dispatch.nim:393 msg="REVERT opcode executed"
/home/arnetheduck/status/nimbus-eth1/nimbus/nimbus.nim(212) nimbus
/home/arnetheduck/status/nimbus-eth1/nimbus/nimbus.nim(174) process
/home/arnetheduck/status/nimbus-eth1/vendor/nim-chronos/chronos/asyncloop.nim(279) poll
/home/arnetheduck/status/nimbus-eth1/vendor/nim-chronos/chronos/asyncmacro2.nim(74) colonanonymous
/home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/p2p/blockchain_sync.nim(263) obtainBlocksFromPeer
/home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/p2p/blockchain_sync.nim(159) returnWorkItem
/home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/p2p/blockchain_sync.nim(90) persistWorkItem
/home/arnetheduck/status/nimbus-eth1/nimbus/p2p/chain.nim(136) persistBlocks
/home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/trie/db.nim(163) dispose
/home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/trie/db.nim(148) rollback
/home/arnetheduck/status/nimbus-eth1/vendor/nimbus-build-system/vendor/Nim/lib/system/assertions.nim(29) failedAssertImpl
/home/arnetheduck/status/nimbus-eth1/vendor/nimbus-build-system/vendor/Nim/lib/system/assertions.nim(22) raiseAssert
/home/arnetheduck/status/nimbus-eth1/vendor/nimbus-build-system/vendor/Nim/lib/system/fatal.nim(49) sysFatal
[[reraised from:
/home/arnetheduck/status/nimbus-eth1/nimbus/nimbus.nim(212) nimbus
/home/arnetheduck/status/nimbus-eth1/nimbus/nimbus.nim(174) process
/home/arnetheduck/status/nimbus-eth1/vendor/nim-chronos/chronos/asyncloop.nim(279) poll
/home/arnetheduck/status/nimbus-eth1/vendor/nim-chronos/chronos/asyncmacro2.nim(96) colonanonymous
]]
Error: unhandled exception: /home/arnetheduck/status/nimbus-eth1/vendor/nim-eth/eth/trie/db.nim(148, 12) `t.db.mostInnerTransaction == t and t.state == Pending` [AssertionError]
I wonder if this is related to `REVERT` abusing the EVM `setError` mechanism. If it is related to `REVERT` after OOG, then this combination is not covered by eth-tests.
It's hard to tell from the logs whether something else happens after the `REVERT` and before the crash. However, if the crash is indeed caused by the error handling that `REVERT` triggers, then my suspicion is that the problem isn't `REVERT` itself, because the same path could be triggered by any other VM error as well.
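To illustrate the point that `REVERT` shouldn't be special here, a minimal sketch (names are illustrative, not the actual Nimbus EVM API) of a shared error mechanism: both `REVERT` and an out-of-gas condition set the same error field on the computation, so any cleanup keyed only on "an error occurred" runs identically for either cause.

```python
# Hypothetical model of a shared EVM error mechanism -- not the real
# Nimbus code. Both REVERT and OOG funnel through the same setError
# path, so downstream error handling cannot tell them apart.

class Computation:
    def __init__(self):
        self.error = None

    def set_error(self, msg):
        # Shared mechanism: callers that only check `error is not None`
        # treat REVERT and OOG the same way.
        self.error = msg

def execute(op, comp):
    if op == "REVERT":
        comp.set_error("REVERT opcode executed")
    elif op == "SSTORE_OOG":
        comp.set_error("Out of gas")

for op in ("REVERT", "SSTORE_OOG"):
    c = Computation()
    execute(op, c)
    print(op, "->", "error exit" if c.error else "ok")
```

If the crash lives in the error-handling path rather than in `REVERT` itself, any opcode that reaches `set_error` would trigger it.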
The revert path looks much like any other VM error exit. Changing how revert works might make this error go away, but that would likely just hide a bug that is still there for other error sequences (which are presumably rare, but not impossible). I think it's better to try to make this reproducible.
My suspicion is something in the double rollback of the database state, when a `depth=1` error return is followed by a second `depth=0` error return (assuming the two errors are for the same transaction).
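The double-rollback hypothesis can be sketched with a toy nested-transaction stack (hypothetical Python model, not the actual nim-eth code) that enforces the same invariant as the failing assert in `db.nim(148)`: only the innermost pending transaction may be rolled back, so rolling back the same frame twice trips the assertion.

```python
# Hypothetical model of the nested-transaction invariant from
# trie/db.nim -- not the real implementation. A transaction may only be
# rolled back while it is the innermost pending one; a second rollback
# of the same frame violates the invariant.

class Db:
    def __init__(self):
        self.transactions = []  # stack; last entry is the innermost

    @property
    def most_inner_transaction(self):
        return self.transactions[-1] if self.transactions else None

class Transaction:
    def __init__(self, db):
        self.db = db
        self.state = "Pending"
        db.transactions.append(self)

    def rollback(self):
        # Same shape as the failing assert:
        # `t.db.mostInnerTransaction == t and t.state == Pending`
        assert self.db.most_inner_transaction is self and self.state == "Pending"
        self.state = "RolledBack"
        self.db.transactions.pop()

db = Db()
tx = Transaction(db)
tx.rollback()          # first rollback: innermost and pending, fine
try:
    tx.rollback()      # second rollback of the same frame
except AssertionError:
    print("double rollback trips the invariant")
```

A `depth=1` error that rolls back, followed by a `depth=0` error that tries to roll back the same state again, would hit exactly this kind of check.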
I encounter no assertion error like this when syncing goerli, at least not after calling `persistBlock`.