Block with more than u16 events fails to be decoded
-
I'm submitting a ...
- [x] Bug report
-
What is the current behavior and expected behavior?
When a block has more than 64 * 1024 events, it fails to be decoded by the API, which causes other components (like the Polkadot UI) to crash.
To reproduce, I connected to a local Polkadot node v1.0.0 and sent 4 batches of 8500 remarks, with all the events landing in the same block.
The node processes them correctly, but on the API side, if I try to decode the events, I get an error:
Error: Unable to decode storage system.events:: createType(Vec<FrameSystemEventRecord>):: Vec length 72264 exceeds 65536
This is coming from this file.
Here's also my initial question on Stack Exchange.
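A minimal sketch of how the error surfaces on the read side (assuming a local node at ws://127.0.0.1:9944; the block height is a placeholder for the affected block):

```js
const { ApiPromise, WsProvider } = require('@polkadot/api');

const main = async () => {
  const api = await ApiPromise.create({
    provider: new WsProvider('ws://127.0.0.1:9944')
  });

  // Placeholder height: the block that contains the 72264 events.
  const blockHash = await api.rpc.chain.getBlockHash(1234);
  const apiAt = await api.at(blockHash);

  // Throws: Unable to decode storage system.events::
  //   createType(Vec<FrameSystemEventRecord>):: Vec length 72264 exceeds 65536
  const events = await apiAt.query.system.events();
  console.log(events.length);
};

main().catch(console.error).finally(() => process.exit());
```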
What is the motivation for changing the behavior?
With the rise of new types of chains, rollups, and even inscriptions, blocks can definitely contain a lot of events, and the Polkadot API is the main library for interacting with such chains. This issue does not reproduce in subxt, for example. If the limit is a deliberate choice, I would be interested to know the motivation.
-
Please tell us about your environment:
-
Version:
- Polkadot API: ^10.10.1
- Polkadot node: v1.0.0
-
Environment: Ubuntu 22.04
- [x] Node.js
-
Language:
- [x] TypeScript (tsc --version: ^5.2.2)
-
Hi @IkerAlus, is there an update here? It's starting to impact us.
I am happy to start taking a look at this, but that being said, I am hesitant to just change the MAX_LENGTH for Vecs without some heavy testing.
Not sure what the residual effects could be, but I am sure Jaco put that there for good reason (I hope).
Pushing this issue to the top of the queue, I'll be expediting this today and tomorrow.
The source of the changes above: https://github.com/polkadot-js/api/pull/2670
This is also the first time it was introduced: https://github.com/polkadot-js/api/commit/7b04ea0bfeea7b3fc23abde32fe46fe8c0676bc3#diff-0d925a4fc950736275a23f6d43d19c518deab3978a724752c54ad22202f7454f
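For context, the check those changes introduced is essentially a length guard applied while decoding a Vec. A simplified sketch of that kind of guard (not the actual library source; names are illustrative), using the real compactFromU8aLim helper from @polkadot/util:

```js
const { compactFromU8aLim } = require('@polkadot/util');

// Illustrative constant: the cap the error message above refers to.
const MAX_LENGTH = 64 * 1024; // 65536

// Decode the SCALE compact length prefix of a Vec, then reject anything
// above the sanity cap before trying to decode that many items.
function decodeVecLength (u8a) {
  const [offset, length] = compactFromU8aLim(u8a);

  if (length > MAX_LENGTH) {
    throw new Error(`Vec length ${length} exceeds ${MAX_LENGTH}`);
  }

  return [offset, length];
}
```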
Hey @TarikGul regarding the new value - it would be ideal to increase it to at least 256k, as we have already seen chains with 144k events in a single block in production. I am thinking maybe 512k as a safe ground that will keep this problem from being brought up again in the near future (hopefully). What do you think?
That would be ideal, and I hope it's that straightforward: I am looking into the feasibility right now. Currently it uses compactFromU8aLim, which has limitations attached to it, which is where I think the original MAX_VALUE comes from.
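For what it's worth, the compact length prefix itself can represent values far beyond 64 * 1024: SCALE compact encoding switches modes at 2^6, 2^14 and 2^30, so lengths like 256k or 512k still fit in the four-byte mode. A quick round-trip check using the real @polkadot/util helpers:

```js
const { compactToU8a, compactFromU8aLim } = require('@polkadot/util');

// SCALE compact mode boundaries: 1-byte prefix up to 2^6 - 1, 2-byte up to
// 2^14 - 1, 4-byte up to 2^30 - 1, big-integer mode beyond that.
for (const len of [63, 16_383, 65_536, 144_000, 262_144, 524_288]) {
  const encoded = compactToU8a(len);
  const [offset, decoded] = compactFromU8aLim(encoded);

  console.log(`${len} -> ${offset}-byte prefix, round-trips to ${decoded}`);
}
```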
Made some local scripts to test and work on this. Will post some of the walkthrough I've done tomorrow, but for now it's getting late!
This is the script I am currently using:
```js
require('@polkadot/api-augment');

const { ApiPromise, WsProvider } = require('@polkadot/api');
const { Keyring } = require('@polkadot/keyring');
const { cryptoWaitReady } = require('@polkadot/util-crypto');

const main = async () => {
  await cryptoWaitReady();

  const keyring = new Keyring();
  const alice = keyring.addFromUri('//Alice', { name: 'Alice' }, 'sr25519');
  const api = await ApiPromise.create({
    provider: new WsProvider('ws://127.0.0.1:9944')
  });

  // 4 batches of 8500 remarks each, all landing in the same block.
  const txs = [];

  for (let i = 0; i < 8500; i++) {
    txs.push(api.tx.system.remark('0x00'));
  }

  const batches = [
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs)
  ];

  await api.tx.utility.batchAll(batches).signAndSend(alice);
};

main().finally(() => process.exit());
```
So I created a successful transaction and submitted it, with 34k events.
Then, using sidecar (just a RESTful wrapper around pjs), I queried it to see if I would get any errors, and I didn't.
Next: I'll try to decode from scratch, and also increase the amount of data in each remark.
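For the decode-from-scratch step, a sketch along these lines should hit the failure directly (the storage key is the well-known System.Events key; the block height is a placeholder):

```js
const { ApiPromise, WsProvider } = require('@polkadot/api');

// Well-known System.Events storage key: twox128("System") ++ twox128("Events").
const EVENTS_KEY = '0x26aa394eea5630e07c48ae0c9558cef780d41e5e16056765bc8461851072c9d7';

const main = async () => {
  const api = await ApiPromise.create({
    provider: new WsProvider('ws://127.0.0.1:9944')
  });

  // Placeholder height: the block holding the oversized event vector.
  const blockHash = await api.rpc.chain.getBlockHash(1234);
  const raw = await api.rpc.state.getStorage(EVENTS_KEY, blockHash);

  // This is the call the original error message points at.
  const events = api.createType('Vec<FrameSystemEventRecord>', raw.toU8a(true));
  console.log(events.length);
};

main().catch(console.error).finally(() => process.exit());
```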
Hey @TarikGul, you can use this block: https://bittensor.com/scan/block/3014340 to test decoding of 144k events.
@valentunn What chain is that for?
@TarikGul it's for Bittensor, sending you some RPC endpoints to try out.
@TarikGul Hey, any news on this?
@Leouarz No, nothing more than what I had above. Unfortunately, things that are very high priority moved ahead of this in the queue; I hope to tackle this again once things cool down.
I've been looking into it. I have reproduced it locally by increasing the number of items in the snippet above:
```js
(...)
  for (let i = 0; i < 10_922; i++) { // utility.batchedCallsLimit
    txs.push(api.tx.system.remark('0x00'));
  }

  const batches = [
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs),
    api.tx.utility.batch(txs)
  ];
(...)
```
I've been looking into how compactFromU8aLim works. I can't be 100% sure, but I believe it's OK to increase our limit: https://github.com/polkadot-js/api/pull/5947
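For anyone who wants to verify against a live chain once the limit is raised, a sketch along these lines should work (the endpoint is a placeholder; block 3014340 is the Bittensor block with ~144k events mentioned above):

```js
const { ApiPromise, WsProvider } = require('@polkadot/api');

const main = async () => {
  const api = await ApiPromise.create({
    provider: new WsProvider('wss://bittensor-rpc.example') // placeholder endpoint
  });

  const blockHash = await api.rpc.chain.getBlockHash(3014340);
  const apiAt = await api.at(blockHash);

  // Previously threw the "Vec length ... exceeds 65536" error; should now decode.
  const events = await apiAt.query.system.events();

  console.log(`decoded ${events.length} events`);
};

main().catch(console.error).finally(() => process.exit());
```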
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue if you think you have a related problem or query.