
Ethereum to Increase the Blocksize by 8x

source link: https://www.tuicool.com/articles/UzQrAfN

Ethereum is partially addressing the many complexities of sharding by simply increasing the blocksize from the equivalent of about 1MB every ten minutes to circa 8MB. Danny Ryan, the ethereum 2.0 coordinator, publicly said:

“We are making the blocks bigger based on recent research on safe block size and propagation times, so the data availability of the system is still > 1MB/s so you can still get similar scalability gains when doing things like ZKrollup and OVM.”

ZK rollups are a hybrid scaling method that combines on-chain security with second-layer networks through smart contracts and zero-knowledge methods.

OVM is the Optimistic Virtual Machine from Plasma, with both operating largely as layers on top of ethereum’s public blockchain.
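To make the distinction a little more concrete, here is a minimal Python sketch of the pattern both approaches share: transactions are batched off-chain and only a compact commitment is posted to the base layer, accompanied by a validity proof in the ZK rollup case, or left open to fraud proofs in the optimistic case. The class and function names are invented for illustration and do not correspond to any real rollup or OVM API.

```python
import hashlib
from dataclasses import dataclass, field
from typing import Optional

def commit(data: bytes) -> str:
    """Stand-in for an on-chain commitment (here just a hash)."""
    return hashlib.sha256(data).hexdigest()

@dataclass
class Layer1:
    """Toy base chain that stores only batch commitments, not the transactions themselves."""
    batches: list = field(default_factory=list)

    def submit_batch(self, state_root: str, proof: Optional[str]) -> None:
        # A ZK rollup posts a validity proof with the new state root; an
        # optimistic rollup posts the root alone and relies on a challenge
        # window during which anyone may submit a fraud proof.
        self.batches.append({"state_root": state_root, "proof": proof})

def zk_rollup_batch(l1: Layer1, txs: list) -> None:
    state_root = commit(b"".join(txs))
    validity_proof = commit(state_root.encode())  # placeholder for a real zero-knowledge proof
    l1.submit_batch(state_root, validity_proof)

def optimistic_batch(l1: Layer1, txs: list) -> None:
    state_root = commit(b"".join(txs))
    l1.submit_batch(state_root, proof=None)       # fraud proofs come later, if at all

l1 = Layer1()
zk_rollup_batch(l1, [b"tx1", b"tx2"])
optimistic_batch(l1, [b"tx3", b"tx4"])
print(l1.batches)
```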

The blockchain itself is to be sharded, with each shard, put very simply, behaving somewhat like the current ethereum network.

There were meant to be 1024 shards, meaning capacity would have been the current network times 1024. Thus theoretically ethereum 2.0 would have had the ability to process one billion transactions a day, as opposed to just about one million currently.
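As a quick back-of-the-envelope check of that figure, taking the article’s rough one million transactions a day for the current network as the only input, the arithmetic is a straightforward multiplication:

```python
# Rough check of the "one billion transactions a day" figure for the original
# 1024-shard design. The ~1 million tx/day baseline is the article's own
# rough figure for the current network.
CURRENT_TX_PER_DAY = 1_000_000
OLD_SHARD_COUNT = 1024

print(f"{CURRENT_TX_PER_DAY * OLD_SHARD_COUNT:,} tx/day")  # 1,024,000,000, i.e. about one billion
```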

That design has been changed considerably, with just 64 shards now proposed, shards that are far more closely integrated with one another.

However, the per-shard data limit will be increased by 8x, so in theory 64 times 8, or 512, times the current roughly one million transactions a day, making it about half a billion. Ryan says:

“To achieve a similar scalability as the previous proposal, target shard block sizes are being increased 8x, from 16kB to 128kB.

This provides the system with greater than 1 MB/s of data availability which synergizes well with promising L2 schemes…

The network safety of these larger shard block sizes are justified by recent experimental research done on the existing Ethereum network.”
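The same arithmetic for the revised design, together with the data availability figure from the quote, can be sketched as follows. The shard block interval is not given in the article, so it is left as a parameter here; the greater-than-1-MB/s claim holds whenever 64 shards each produce a 128 kB block more often than roughly every 8 seconds, and with an assumed 6-second interval the figure works out to about 1.3 MB/s.

```python
# Revised 64-shard design: 8x larger shard blocks give 64 * 8 = 512 times the
# article's ~1 million tx/day baseline, i.e. about half a billion a day.
CURRENT_TX_PER_DAY = 1_000_000
NEW_SHARD_COUNT = 64
PER_SHARD_MULTIPLIER = 8            # shard blocks grow from 16 kB to 128 kB

print(f"{CURRENT_TX_PER_DAY * NEW_SHARD_COUNT * PER_SHARD_MULTIPLIER:,} tx/day")  # 512,000,000

def data_availability_mb_per_s(shards: int, block_kb: int, slot_seconds: float) -> float:
    """Aggregate shard data produced per second, in MB."""
    return shards * block_kb / 1024 / slot_seconds

# Assumed 6-second shard block interval (not stated in the article):
print(round(data_availability_mb_per_s(64, 128, 6.0), 2))  # ~1.33 MB/s, i.e. > 1 MB/s
```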

The research he is referring to appears to focus only on uncle (orphan) rates, without considering things like sync times or potential bottlenecks elsewhere in this design, as the 64 shards need to share a lot of data.

Nor is it clear why, based on that research, ethereum devs are not raising the defaults for the current network by 8x, expecting miners to follow such defaults through inertia or the game mechanics such a change would introduce.

What is somewhat clear, however, is that it is turning out to be difficult to diverge from Satoshi Nakamoto’s suggested way of scaling, with it all coming full circle save for the many efficiencies and compressions that can be gained when the focus is on actually scaling the network through things like pruning and much else.

Copyrights Trustnodes.com

