"Everyone ignores MegaETH and actually almost eliminates the gas limit of EVM" -@0x_ultra
This claim has attracted some attention on the X timeline - let's analyze how it works and what its impact is.
Typical blockchain networks

First, let's outline how a traditional network is composed, so we can highlight the differences.
I'll use an image to simplify the explanation (if this is already familiar to you, feel free to skip this part):
Common roles in blockchain networks: block producers, node networks, and users.
Now let's break down what each of these roles means.
Common network roles
Block producer
This is the entity responsible for creating the blocks that get appended to the chain.
On an L1, this is a diverse, distributed set of validators randomly selected for the role, while common L2 constructions hand the role to a single machine: the sequencer.
The key difference between these two kinds of block producer is that a sequencer usually has much higher hardware requirements and rarely, if ever, gives up the role, whereas validators rotate constantly (for example, Solana's leader rotates after ~1.2 seconds).
Full nodes
These machines receive the blocks generated by the block producer (whether validators or a sequencer), execute them themselves to verify their correctness against the existing chain history, and then update their local "truth" to stay in sync with the chain.
Once synchronized, they can serve this information to application users, developers who want to query chain data, and so on. This is the "network" part of the blockchain.
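To make that job concrete, here's a minimal, self-contained Python sketch of a full node's sync loop. All types and the toy state-root hash are invented for illustration; this is not any real client's API.

```python
from dataclasses import dataclass
from hashlib import sha256

@dataclass
class Block:
    number: int
    transactions: list   # toy txs: (account, balance_delta) pairs
    state_root: str      # the producer's claimed post-state hash

def state_root(state: dict) -> str:
    # Toy stand-in for a Merkle state root: hash of the sorted state.
    return sha256(repr(sorted(state.items())).encode()).hexdigest()

def sync(state: dict, blocks: list[Block]) -> dict:
    """Full-node loop: re-execute every block, verify the claimed root."""
    for block in blocks:
        for account, delta in block.transactions:
            state[account] = state.get(account, 0) + delta
        # Reject the block if our independently computed state disagrees.
        if state_root(state) != block.state_root:
            raise ValueError(f"state mismatch at block {block.number}")
    return state
```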
It's worth noting that a network only moves as fast as its slowest required entity.
This means that if the entities serving chain data cannot keep up with verifying the blocks the validators/sequencer produce, the network as a whole runs at the pace of that bottleneck.
Users
This is you. When you read data from an application or submit a transaction to the chain, everything is routed through a full node that stays in sync with the block producer. Simple enough.
The hardware protocol

So, those are all the parties involved - great. But what does this have to do with gas limits? To understand that, we need to discuss what gas represents, along with the other two scaling dimensions of a distributed network.
In short, the gas limit caps the computational complexity of a block, and it is the network's promise to its nodes: "to keep up with the blocks we produce, you only need X hardware." It is, in essence, a rate-limiting mechanism.
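As a toy illustration of that rate limiting (the 30M figure is Ethereum mainnet's ballpark 2024 limit; everything else is invented):

```python
BLOCK_GAS_LIMIT = 30_000_000  # ballpark Ethereum mainnet limit in 2024

def build_block(mempool: list[dict]) -> tuple[list[dict], int]:
    """Pack transactions until the block's gas budget runs out."""
    block, gas_used = [], 0
    for tx in mempool:
        if gas_used + tx["gas"] > BLOCK_GAS_LIMIT:
            continue  # doesn't fit this block; wait for the next one
        block.append(tx)
        gas_used += tx["gas"]
    return block, gas_used

# Note: a single transaction whose gas exceeds the limit can never be
# included at all - the gas limit is a hard ceiling on tx complexity.
```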
However, this is not the only dimension that determines the throughput of a chain.
The other two factors are:
Bandwidth - a node's upload/download speed, which lets it communicate with the rest of the network
Storage - the hardware a node needs to store chain data; the more history the chain processes, the more it has to store
Together with compute, these make up the network's implicit "hardware protocol" (sketched in code below):
The three scaling dimensions that affect network throughput
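Sketched as data, the hardware protocol is just a three-axis requirement a node must meet on every axis at once (all thresholds below are made up for illustration):

```python
from dataclasses import dataclass, fields

@dataclass
class HardwareProfile:
    bandwidth_mbps: float  # download/upload every block in time
    compute_mgas_s: float  # re-execute every transaction (Mgas/second)
    storage_tb: float      # hold the full chain state and history

# Invented thresholds; the real numbers are whatever the chain's
# parameters (gas limit, block size, state growth) imply.
REQUIRED = HardwareProfile(bandwidth_mbps=50, compute_mgas_s=3, storage_tb=2)

def can_keep_up(node: HardwareProfile) -> bool:
    # Failing ANY single axis means falling out of sync with the chain.
    return all(getattr(node, f.name) >= getattr(REQUIRED, f.name)
               for f in fields(HardwareProfile))
```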
In crypto's traditional setup, a single machine - the full node - is expected to stand on its own and handle the maximum possible requirements across all three dimensions.
A full node must have:
The bandwidth to download/upload every block
The compute to re-execute every transaction in every block
The storage to hold the entire chain state
Of these, compute is usually the most restrictive on the average EVM network, which is why gas limits are roughly similar across well-decentralized networks:
Table: comparison of gas parameters across EVM chains in 2024 (source: Paradigm [https://www.paradigm.xyz/2024/04/reth-perf])
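The comparison the table makes reduces to simple arithmetic: a chain's sustained compute demand is its gas limit divided by its block time. A rough worked example, using ballpark public figures rather than values from the table:

```python
# Sustained compute throughput = gas limit / block time.
# Figures below are ballpark public numbers, not values from the table.

chains = {
    "Ethereum L1": (30_000_000, 12.0),  # ~30M gas, ~12s blocks
    "Typical L2":  (30_000_000, 2.0),   # same gas budget, faster blocks
}

for name, (gas_limit, block_time_s) in chains.items():
    print(f"{name}: {gas_limit / block_time_s / 1e6:.1f} Mgas/s")

# Ethereum L1: 2.5 Mgas/s, Typical L2: 15.0 Mgas/s - the rate a full
# node's CPU must sustain just to re-execute the chain in real time.
```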
The problem, then, comes down to the compute a single machine needs to keep up with the chain's block producer. How do you solve it? Node specialization.
Node specialization: MegaETH's answer

"What the hell is node specialization?"
It just means splitting the traditional single entity - the full node - into a group of specialized machines, each serving a specific function.
Before: the full node had to handle the block producer's maximum bandwidth, compute, and storage output.
Now: the full node is replaced by a replica node, which receives only state diffs rather than full blocks. The full blocks are distributed across a network of prover nodes, which execute them independently and then report proofs of each block's validity back to the replica nodes.
Visualization:
Visualization of the relationship between the prover network and replica nodes
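To contrast the two worlds, here's a toy Python sketch (invented types and a stand-in proof check; not MegaETH's actual protocol):

```python
def full_node_process(state: dict, block: dict) -> dict:
    """Old world: re-execute every transaction yourself."""
    for account, delta in block["transactions"]:
        state[account] = state.get(account, 0) + delta
    return state

def verify_proof(proof: int, state_diff: dict) -> bool:
    # Stand-in for a real validity-proof check, which is far cheaper
    # than re-execution - that cost gap is the whole point of the split.
    return proof == hash(frozenset(state_diff.items()))

def replica_node_process(state: dict, state_diff: dict, proof: int) -> dict:
    """New world: apply the diff, check the prover network's proof."""
    if not verify_proof(proof, state_diff):
        raise ValueError("invalid block proof")
    state.update(state_diff)  # overwrite only the keys that changed
    return state
```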
The impact of the above:
Because compute (i.e. transaction complexity) is no longer handled per block by a single entity, but spread across a group of machines in the prover network, it stops being the most pressing scaling dimension - it is all but eliminated as a constraint
This shifts the problem to bandwidth and storage. Storage is the current focus due to state growth; to address it, the team is iterating on a pricing model based on the number of key-value (kv) updates rather than on transaction complexity (gas). And splitting a single machine into a group of machines does inject some trust assumptions into this particular setup.
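A hypothetical sketch of what kv-based pricing could look like; the constant and the function are invented for illustration, not MegaETH's actual model:

```python
# Hypothetical: charge by state impact (slots written), not by compute.
# State growth, not execution complexity, is the replica node's scarce
# resource once provers absorb the execution work.
PRICE_PER_KV_UPDATE = 100  # invented fee units per written slot

def state_fee(state_diff: dict) -> int:
    return len(state_diff) * PRICE_PER_KV_UPDATE

# A compute-heavy tx touching one slot stays cheap...
print(state_fee({"pool.price": 42}))                      # -> 100
# ...while a tx that bloats state across many slots pays for it.
print(state_fee({f"slot{i}": i for i in range(1_000)}))   # -> 100000
```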
On the trust-assumption point, it's worth noting that MegaETH will also offer a full-node option for those who want to verify 100% of the chain state themselves.
The latest node specification provided by MegaETH
Good, the compute/gas limit is gone - what does this mean for me?
The impact of no gas limit

At the highest level, this simply means "people can do more complex things on-chain" - the kinds of things that normally run into strict size limits on contracts and transactions.
@yangl1996's direct answer to @dailofrog (an avid on-chain artist):
Beyond that, some example categories:
Complex on-chain computation
Running machine learning models directly in smart contracts
Real-time price calculations
Fully sorting large arrays without loop limits (see the rough gas estimate after these lists)
Graph algorithms that traverse entire networks/relationship graphs

Storage and state management
Maintaining larger in-contract data structures
Keeping more historical data in contract storage
Processing batch operations in a single transaction
Protocol design
Run">
Running full zero-knowledge proof verification
Complex cryptographic operations without off-chain components
Real-time automated market makers with complex formulas
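As a rough sanity check on the sorting example above: assuming an in-EVM comparison sort costs on the order of ~500 gas per comparison (an invented but plausible order of magnitude), a full sort blows through a conventional 30M gas block very quickly:

```python
import math

GAS_PER_COMPARISON = 500       # invented order-of-magnitude cost
BLOCK_GAS_LIMIT = 30_000_000   # conventional EVM-chain ballpark

def sort_gas(n: int) -> float:
    # Comparison sorts do O(n log n) comparisons.
    return n * math.log2(n) * GAS_PER_COMPARISON

for n in (1_000, 10_000, 100_000):
    g = sort_gas(n)
    verdict = "fits in" if g <= BLOCK_GAS_LIMIT else "exceeds"
    print(f"n={n:>7,}: ~{g / 1e6:.0f}M gas ({verdict} a 30M block)")

# n=100,000 alone needs ~830M gas - roughly 28 full blocks of compute.
# Remove the compute ceiling and this whole class of program opens up.
```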
Ultimately, this is simply about on-chain creativity. It's a shift in mindset - away from scarcity, gas optimization, and contract micro-optimization, toward a paradigm of EVM abundance.
We'll see how teams ultimately leverage it, but I think it will turn out to be something the ecosystem has quietly been craving for a long time.