Vitalik: The core difficulties of blockchain scalability are computation, data, and state

Jan 27, 2026 09:52:59


Vitalik Buterin published an article laying out a layered view of blockchain scalability, arguing that the three underlying resources become progressively harder to scale in the order of computation, data, and state.

Vitalik stated that computation is the easiest to scale: it can be parallelized, assisted by "hints" supplied by block builders, or replaced outright with proofs such as zero-knowledge proofs. Data is of medium difficulty: if the system requires data availability guarantees, that bandwidth cost cannot be avoided, but it can be reduced through data sharding, erasure coding (as in PeerDAS), and "graceful degradation," meaning the network can still produce correspondingly smaller blocks when nodes can handle less data.

State, by contrast, is the hardest to scale. Vitalik pointed out that validating even a single transaction requires access to the full state; even if the state is organized as a tree and only the root hash is kept, updating that root still depends on the rest of the tree. State-sharding schemes exist, but they typically demand major architectural changes and are not general-purpose solutions.
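The point that the root alone is not enough can be made concrete with a binary Merkle tree (my assumption here; the sketch is illustrative, not any specific client's implementation): to recompute the root after changing one leaf, you also need the sibling hashes along that leaf's path, i.e. state beyond the root itself.

```python
# Updating one leaf of a Merkle-ized state tree: even if you hold the
# old root, recomputing the new root requires the sibling hashes on
# the leaf's path -- extra state that must come from somewhere.

import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Root of a complete binary Merkle tree (len(leaves) a power of 2)."""
    level = [h(l) for l in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def update_with_proof(new_leaf: bytes, index: int, siblings: list[bytes]) -> bytes:
    """Recompute the root from a new leaf plus its path of sibling hashes."""
    node = h(new_leaf)
    for sib in siblings:
        node = h(sib + node) if index % 2 else h(node + sib)
        index //= 2
    return node

leaves = [b"acct0", b"acct1", b"acct2", b"acct3"]
# Proof for leaf 1: its sibling leaf's hash and the right subtree's hash.
siblings = [h(leaves[0]), h(h(leaves[2]) + h(leaves[3]))]
leaves[1] = b"acct1-updated"
assert update_with_proof(leaves[1], 1, siblings) == merkle_root(leaves)
```

Stateless-validation designs exploit exactly this: transactions carry the sibling hashes ("witnesses") so validators need not store the full tree, trading state for data.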

Based on this, Vitalik concluded that if data can substitute for state without introducing new centralization assumptions, that substitution should be prioritized; likewise, if computation can substitute for data without introducing new centralization assumptions, it deserves serious consideration.
