[Bug] Applying N entity operation(s) log `block_hash` broken on Substreams
Bug report
I'm running a Substreams-only config: https://github.com/streamingfast/graph-node-dev/blob/master/config/substreams-generic.toml.
I started the Uniswap V3 Substreams-powered subgraph on it. When the `Applying 22 entity operation(s)` log line is displayed, the `block_hash` value is wrong: we get the hex representation of a string that is itself already hexadecimal. Essentially, something like this happens: `let input = b"61c558e1ba724af7924f849c0dc390037b529a0c1b37ff277c6b34c180d8d6ed"; hex::encode(input)`.
I imagine this is specific to `chain/substreams`; I have never seen this on other chain implementations.
This is against the master branch.
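The suspected double-encoding can be sketched with std only (this `hex_encode` is my stand-in for the `hex` crate's `hex::encode`; the hash value is taken from the snippet above):

```rust
// Sketch of the suspected bug: the block hash is already an ASCII hex
// string, but its raw bytes get hex-encoded a second time.
// `hex_encode` is a std-only stand-in for `hex::encode`.
fn hex_encode(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{:02x}", b)).collect()
}

fn main() {
    // A 64-character hex string (the real block hash)...
    let already_hex = b"61c558e1ba724af7924f849c0dc390037b529a0c1b37ff277c6b34c180d8d6ed";
    // ...hex-encoded again, yielding the 128-character values seen in the logs.
    let double_encoded = hex_encode(already_hex);
    assert_eq!(double_encoded.len(), 128);
    println!("{double_encoded}");
}
```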
Relevant log output
Oct 27 16:49:57.987 INFO Applying 78 entity operation(s), block_hash: 0x31383536616466343230303235656561616330393963653762643532303266666266383836613533383536343638303235626666643634316462323433323133, block_number: 12388136, sgd: 1, subgraph_id: QmQJovmQLigEwkMWGjMT8GbeS2gjDytqWCGL58BEhLu9Ag, component: SubgraphInstanceManager
Oct 27 16:49:58.071 INFO Applying 72 entity operation(s), block_hash: 0x61346665633532336133383263353639313739303939656535623034366431636537363864306239366334316266326436343632393238613339373432616133, block_number: 12388137, sgd: 1, subgraph_id: QmQJovmQLigEwkMWGjMT8GbeS2gjDytqWCGL58BEhLu9Ag, component: SubgraphInstanceManager
Oct 27 16:49:58.140 INFO Applying 59 entity operation(s), block_hash: 0x37663831656262323139316663633262353433386631643930326139653735336435636330623439313430373635666632313237313062633061303934376562, block_number: 12388138, sgd: 1, subgraph_id: QmQJovmQLigEwkMWGjMT8GbeS2gjDytqWCGL58BEhLu9Ag, component: SubgraphInstanceManager
Oct 27 16:49:58.220 INFO Applying 56 entity operation(s), block_hash: 0x37623963333664663938393031343562333161383461666437643133646262336230373461363438373531363838346365633838336338656563646362613838, block_number: 12388139, sgd: 1, subgraph_id: QmQJovmQLigEwkMWGjMT8GbeS2gjDytqWCGL58BEhLu9Ag, component: SubgraphInstanceManager
IPFS hash
No response
Subgraph name or link to explorer
No response
Some information to help us out
- [ ] Tick this box if this bug is caused by a regression found in the latest release.
- [ ] Tick this box if this bug is specific to the hosted service.
- [X] I have searched the issue tracker to make sure this issue is not a duplicate.
OS information
None
$ to_ascii 31383536616466343230303235656561616330393963653762643532303266666266383836613533383536343638303235626666643634316462323433323133
1856adf420025eeaac099ce7bd5202ffbf886a53856468025bffd641db243213
And 1856adf420025eeaac099ce7bd5202ffbf886a53856468025bffd641db243213 is the correct block hash for block number 12388136.
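The `to_ascii` step above can be reproduced with a small std-only Rust sketch (the helper name `hex_decode` is mine, not graph-node's; the strings are the logged and recovered hashes from above):

```rust
// Decode the logged hex string back to bytes, then read those bytes as
// ASCII, recovering the real block hash. Assumes an even-length hex input.
fn hex_decode(s: &str) -> Vec<u8> {
    s.as_bytes()
        .chunks(2)
        .map(|pair| u8::from_str_radix(std::str::from_utf8(pair).unwrap(), 16).unwrap())
        .collect()
}

fn main() {
    let logged = "31383536616466343230303235656561616330393963653762643532303266666266383836613533383536343638303235626666643634316462323433323133";
    let recovered = String::from_utf8(hex_decode(logged)).unwrap();
    assert_eq!(
        recovered,
        "1856adf420025eeaac099ce7bd5202ffbf886a53856468025bffd641db243213"
    );
    println!("{recovered}");
}
```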
cc @mangas
Also ran into it. I think this PR should fix it: https://github.com/graphprotocol/graph-node/pull/4967
Much appreciated @YaroShkvorets
@maoueh From a NEAR Substreams, I get the following in the proto `Clock.id`:
number: 9820431
id: 886d431b0ff918e8f21e38af93952176b8d0d3f14c85aab2941c386b0ea945dd
hash(base64 decoded) f3ce9de37d5bd1f7fdd7c7bc7f6d5edfc69ff77f79db5efa6fc7747777f5e1cf3969a6f6f78d5cdfce9bd1e6bde3975d,
How is this encoded? The block hash should be ABZ5aMBM8B1Y6a4WirQJgacBT5r4FDBaDNWr38mzyPKa
Are these encoded differently per chain?
Edit: I found the right answer here, but this seems to require a larger fix, since for NEAR the expected operation is hex-decode + Base58-encode to recover the right block hash.
Yes, that's an interesting problem indeed. It happens because the Substreams clock is chain-agnostic: it can't know that it should Base58-encode the block's hash, so it does `hex(block.hash)` instead.
$ to_base58 -hex 886d431b0ff918e8f21e38af93952176b8d0d3f14c85aab2941c386b0ea945dd
ABZ5aMBM8B1Y6a4WirQJgacBT5r4FDBaDNWr38mzyPKa
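The hex-decode + Base58-encode round trip above can be sketched with std only (this `base58_encode` is a minimal reimplementation of the Bitcoin/NEAR-style encoding, not code from graph-node or Substreams):

```rust
// Bitcoin/NEAR Base58 alphabet (no 0, O, I, l).
const ALPHABET: &[u8] = b"123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz";

// Decode an even-length hex string into raw bytes.
fn hex_decode(s: &str) -> Vec<u8> {
    s.as_bytes()
        .chunks(2)
        .map(|p| u8::from_str_radix(std::str::from_utf8(p).unwrap(), 16).unwrap())
        .collect()
}

// Treat the input as a big-endian integer and repeatedly divide by 58.
fn base58_encode(input: &[u8]) -> String {
    // Each leading zero byte becomes a leading '1'.
    let zeros = input.iter().take_while(|&&b| b == 0).count();
    let mut digits: Vec<u8> = Vec::new(); // base-58 digits, least significant first
    for &byte in input {
        let mut carry = byte as u32;
        for d in digits.iter_mut() {
            carry += (*d as u32) << 8; // shift existing digits by one input byte
            *d = (carry % 58) as u8;
            carry /= 58;
        }
        while carry > 0 {
            digits.push((carry % 58) as u8);
            carry /= 58;
        }
    }
    let mut out = String::new();
    for _ in 0..zeros {
        out.push('1');
    }
    for &d in digits.iter().rev() {
        out.push(ALPHABET[d as usize] as char);
    }
    out
}

fn main() {
    // The NEAR Clock.id from the comment above.
    let id = "886d431b0ff918e8f21e38af93952176b8d0d3f14c85aab2941c386b0ea945dd";
    println!("{}", base58_encode(&hex_decode(id)));
}
```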
That's something I overlooked when suggesting the clock approach.
Hashes are stored decoded as bytes, so it shouldn't matter how they arrive, as long as they are decoded properly, no?
The issue is that NEAR wants the block hash in Base58, so `BlockHash::to_string()` needs to be `BlockchainKind`-aware.
Looks like this issue has been open for 6 months with no activity. Is it still relevant? If not, please remember to close it.