
Block with more than u16 events fails to be decoded #5782

Leouarz opened this issue Jan 16, 2024 · 13 comments

Leouarz commented Jan 16, 2024

  • I'm submitting a ...

    • Bug report
  • What is the current behavior and expected behavior?

When a block has more than 64 * 1024 events, it fails to be decoded by the API and makes other components panic (like the Polkadot UI).
To reproduce, I connected to a local Polkadot node v1.0.0 and sent 4 batches of 8,500 remarks, with all events landing in the same block.
The node processes it correctly, but on the API side, if I try to decode the events, I get an error:
Error: Unable to decode storage system.events:: createType(Vec<FrameSystemEventRecord>):: Vec length 72264 exceeds 65536
This comes from this file.
Here's also my initial question on Stack Exchange.

  • What is the motivation for changing the behavior?

With the rise of new types of chains, rollups, and even inscriptions, blocks can definitely contain a lot of events, and the Polkadot API is the main library for interacting with such chains. This issue is not reproducible in subxt, for example. If the limit is a deliberate choice, I would be interested to know the motivation.

  • Please tell us about your environment:

    • Version:
      • Polkadot API: ^10.10.1
      • Polkadot node: v1.0.0
    • Environment: Ubuntu 22.04, Node.js
    • Language: TypeScript (tsc --version: ^5.2.2)
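A self-contained way to see why the reported length is not a codec-level problem: SCALE's compact length prefix encodes 72,264 without trouble, so the 65,536 cap appears to be a guard inside the JS library rather than a limit of the encoding itself. This is a minimal sketch of the compact-integer modes, not the actual @polkadot/util implementation:

```javascript
// Minimal SCALE compact-integer encode/decode (sketch only):
// mode bits 0b00 = single byte, 0b01 = two bytes, 0b10 = four bytes.

function compactEncode(value) {
  if (value < 1 << 6) {
    // single-byte mode: value in the upper 6 bits
    return Uint8Array.from([value << 2]);
  }
  if (value < 1 << 14) {
    // two-byte mode, little-endian
    const v = (value << 2) | 0b01;
    return Uint8Array.from([v & 0xff, (v >> 8) & 0xff]);
  }
  if (value < 1 << 30) {
    // four-byte mode, little-endian
    const v = (value << 2) | 0b10;
    return Uint8Array.from([v & 0xff, (v >> 8) & 0xff, (v >> 16) & 0xff, (v >> 24) & 0xff]);
  }
  throw new Error('big-integer mode not needed for this sketch');
}

function compactDecode(bytes) {
  const mode = bytes[0] & 0b11;
  if (mode === 0b00) return bytes[0] >>> 2;
  if (mode === 0b01) return (bytes[0] | (bytes[1] << 8)) >>> 2;
  if (mode === 0b10) {
    return (bytes[0] | (bytes[1] << 8) | (bytes[2] << 16) | (bytes[3] << 24)) >>> 2;
  }
  throw new Error('big-integer mode not needed for this sketch');
}

// The failing length from the error message round-trips fine in SCALE:
console.log(compactDecode(compactEncode(72264))); // 72264
```

So a Vec length of 72,264 is perfectly representable on the wire; the rejection happens after decoding, at the library's length check.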
@IkerAlus added the bug label Mar 15, 2024
@jamesbayly

Hi @IkerAlus, is there an update here? It's starting to impact us.

@TarikGul
Member

I am happy to start taking a look at this, but that being said, I am hesitant to just change the MAX_LENGTH for Vecs without some heavy testing.

Not sure what the residual effects could be, but I am sure Jaco put it there for good reason (I hope).

@TarikGul TarikGul self-assigned this May 10, 2024
@TarikGul
Member

TarikGul commented Jun 6, 2024

Pushing this issue to the top of the queue, I'll be expediting this today and tomorrow.

@TarikGul
Member

TarikGul commented Jun 6, 2024

The source of the changes above: #2670

@TarikGul
Member

TarikGul commented Jun 6, 2024

This is also the first time it was introduced: 7b04ea0#diff-0d925a4fc950736275a23f6d43d19c518deab3978a724752c54ad22202f7454f

@valentunn

Hey @TarikGul, regarding the new value: it would be ideal to increase it to at least 256k, as we have already seen production chains with 144k events in a single block.
I am thinking maybe 512k as a safe ground that will (hopefully) keep this problem from coming up again in the near future. What do you think?

@TarikGul
Member

TarikGul commented Jun 7, 2024

Hey @TarikGul regarding the new value - would be ideal to increase it at least to 256k as we have seen chains with 144k events in a single block already in production I am thinking about maybe 512k as a safe ground that will allow this problem to not be brought up in the near future (hopefully). What do you think?

That would be ideal, and I hope it's that straightforward: I am looking into the feasibility right now. Currently it uses compactFromU8aLim, which has limitations attached to it, and which is where I think the original MAX_LENGTH comes from.
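For context on where the candidate caps fall: a sketch (assuming standard SCALE compact encoding, not the compactFromU8aLim internals) of how many prefix bytes each length needs. 65,536, 144k, 256k and 512k all sit in the same four-byte mode (anything below 2^30), so raising the cap should not change how the length prefix itself is read:

```javascript
// Byte cost of the SCALE compact length prefix for a given Vec length.
function compactByteLength(value) {
  if (value < 1 << 6) return 1;   // single-byte mode
  if (value < 1 << 14) return 2;  // two-byte mode
  if (value < 1 << 30) return 4;  // four-byte mode
  return null;                    // big-integer mode, out of scope here
}

for (const len of [65536, 144000, 262144, 524288]) {
  console.log(len, '->', compactByteLength(len), 'bytes'); // all 4 bytes
}
```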

@TarikGul
Member

Made some local scripts to test and work on this. I'll post some of the walkthrough I've done tomorrow. But for now it's getting late!

@TarikGul
Member

TarikGul commented Jun 11, 2024

This is the script I am currently using:

require('@polkadot/api-augment');

const { ApiPromise, WsProvider } = require('@polkadot/api');
const { Keyring } = require('@polkadot/keyring');
const { cryptoWaitReady } = require('@polkadot/util-crypto');

const main = async () => {
    await cryptoWaitReady();

    const keyring = new Keyring();
    const alice = keyring.addFromUri('//Alice', { name: 'Alice' }, 'sr25519');

    const api = await ApiPromise.create({
        provider: new WsProvider('ws://127.0.0.1:9944')
    });

    // 8,500 remarks per batch
    const txs = [];
    for (let i = 0; i < 8500; i++) {
        txs.push(api.tx.system.remark('0x00'));
    }

    // 4 batches => ~34,000 remark events in a single block
    const batches = [
        api.tx.utility.batch(txs),
        api.tx.utility.batch(txs),
        api.tx.utility.batch(txs),
        api.tx.utility.batch(txs)
    ];

    await api.tx.utility.batchAll(batches).signAndSend(alice);
};

main().finally(() => process.exit());

@TarikGul
Member

TarikGul commented Jun 11, 2024

So I created a successful transaction and submitted it, producing 34k events.

[screenshot]

Then I used Sidecar (just a RESTful wrapper around pjs) to query the block and see if I would get any errors, and I didn't.

[screenshot]

Next: I'll try to decode it from scratch, and also increase the amount of data in each remark.
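The first step of decoding from scratch can be sketched with synthetic bytes (the blob below is a hand-built compact prefix for a length of 144,000, matching the event count mentioned in this thread, not real chain state): read the SCALE compact length prefix off a raw Vec<EventRecord> storage value before decoding the records themselves.

```javascript
// Read the compact length prefix from the start of a raw SCALE-encoded Vec.
// Returns the decoded length and the offset where the items begin.
function readCompactPrefix(u8a) {
  const mode = u8a[0] & 0b11;
  if (mode === 0b00) return { length: u8a[0] >>> 2, offset: 1 };
  if (mode === 0b01) return { length: (u8a[0] | (u8a[1] << 8)) >>> 2, offset: 2 };
  if (mode === 0b10) {
    const raw = u8a[0] | (u8a[1] << 8) | (u8a[2] << 16) | (u8a[3] << 24);
    return { length: raw >>> 2, offset: 4 };
  }
  throw new Error('big-integer mode (length >= 2**30) not handled in this sketch');
}

// 144,000 encoded as a compact u32: ((144000 << 2) | 0b10), little-endian.
const synthetic = Uint8Array.from([0x02, 0xca, 0x08, 0x00]);
console.log(readCompactPrefix(synthetic)); // { length: 144000, offset: 4 }
```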

@valentunn

Hey @TarikGul, you can use this block to test decoding of 144k events: https://bittensor.com/scan/block/3014340

@TarikGul
Member

@valentunn What chain is that for?

@KarimJedda

@TarikGul it's for Bittensor, sending you some RPC endpoints to try out.

Development

No branches or pull requests

6 participants