Increasing Bitcoin block size will affect those validating with ‘constrained bandwidth’
Bitcoin Block Size.
These three words have been at the center of almost every Bitcoin discussion for the past few years; that is, every discussion not centered on Bitcoin's price. Opinions remain split among prominent proponents, a divide that eventually led to the creation of separate assets, and Bitcoin's ecosystem continues to be divided over what constitutes an ideal block size.
With prominent Bitcoin Core developers often taking the side of no alteration to the current size, Clark Moody, creator of the Bitcoin dashboard that bears his name, recently gave his opinion on the block size debate.
In a recent podcast with Stephan Livera, Moody argued that even though hard drive space is cheap, the implications of increasing Bitcoin's block size must be looked at from a global perspective. According to him, if the block size is too large, several validators will not be able to keep up.
One of the major issues with increasing block size is the fact that nodes or validators with constrained bandwidth will not be able to catch up. Constrained bandwidth is essentially a limit on how fast a connection can operate, and it varies from place to place depending on geography. Upgrading bandwidth can be very expensive in certain countries, and validating larger blocks would demand more resources, he went on to point out.
“If you can’t validate each block on average in 10 minutes or, you know, download it and validate it in 10 minutes, you’re going to fall behind on average and you’ll never actually catch up if the changes keep rolling on ahead.”
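Moody's point can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: the per-block validation time and the bandwidth figures are assumptions, not measurements, but the structure of the argument is as he states it: a node keeps up only if downloading and validating a block takes less, on average, than the roughly 10-minute interval between blocks.

```python
# Illustrative check of whether a node can keep pace with the chain.
# Assumed figures (validation_s, bandwidth values) are hypothetical.

BLOCK_INTERVAL_S = 600  # average time between Bitcoin blocks, in seconds

def keeps_up(block_size_mb: float, bandwidth_mbps: float,
             validation_s: float = 30.0) -> bool:
    """True if download time plus validation time fits in one block interval.

    bandwidth_mbps is in megabits per second; validation_s is an assumed
    per-block validation cost on modest hardware.
    """
    download_s = block_size_mb * 8 / bandwidth_mbps  # megabytes -> megabits
    return download_s + validation_s < BLOCK_INTERVAL_S

# A 1 MB block over a constrained 0.1 Mbps link: 80 s download, keeps up.
print(keeps_up(1, 0.1))    # True
# A 128 MB block over the same link: 10,240 s download, falls behind forever.
print(keeps_up(128, 0.1))  # False
```

The second case illustrates the "never actually catch up" failure mode: if each block takes longer to process than the interval at which new blocks arrive, the backlog grows without bound.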
Earlier in 2019, Luke Dashjr, an independent Bitcoin Core developer, had also proposed that Bitcoin's current block size of 1 MB be reduced to 300 KB in order to make it easier for users to operate a full node on the Bitcoin network.
Proponents of an increased block size usually argue that altering it is imperative to improve Bitcoin's scalability and raise its transactions per second. These proponents also believe that off-chain solutions such as the Lightning Network are, at the moment, incapable of taking the load off the main chain.
However, the major drawback of larger blocks is that they may damage Bitcoin's ethos of decentralization. Larger blocks would make full nodes expensive to operate, leaving fewer users running them. If only well-resourced, centralized entities can afford to validate, that may eventually weaken Bitcoin's value proposition as a decentralized network.