Gavin Andresen Proposes 20 MB Block Size: Pros and Cons

If you’ve been following the core development of Bitcoin, you may have noticed an interesting change by one of the lead developers, Gavin Andresen. The change, made in a GitHub commit in his personal repository, allows clients to accept and create 20MB blocks after March 1, 2016. This was quickly picked up by the Bitcoin community, which then launched into a heated debate as to whether or not this change is a good idea. Many supported the idea and agreed that it is necessary to the evolution of Bitcoin. Others, however, argued that it isn’t needed right now and that it would cause many other problems to arise. Below I’ve listed some of the pros and cons of the newly proposed block size, along with some other possible scenarios. Gavin has started to address many of these scenarios on his blog, some of which are explained below. It’s important to note that I’m not a complete expert on everything related to the Bitcoin protocol, so if my information is incorrect, please feel free to leave a comment!

The Pros

First off, increasing the block size by any amount increases the number of transactions that can be included in each block. This allows for the continued growth of Bitcoin over time. As the number of users increases, so does the number of transactions, so by raising this cap, Bitcoin opens itself up to increased adoption.
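To get a feel for the numbers, here is a rough throughput sketch. The 500-byte average transaction size and the 10-minute block interval are assumptions for illustration, not figures from Gavin's proposal:

```python
AVG_TX_SIZE = 500     # bytes; assumed average transaction size
BLOCK_INTERVAL = 600  # seconds; Bitcoin targets one block every 10 minutes

def throughput(block_size_mb):
    """Return (transactions per block, transactions per second)."""
    txs_per_block = block_size_mb * 1_000_000 // AVG_TX_SIZE
    return txs_per_block, txs_per_block / BLOCK_INTERVAL

for size in (1, 20):
    per_block, per_sec = throughput(size)
    print(f"{size:>2} MB blocks: {per_block} txs/block, ~{per_sec:.1f} tx/s")
```

Under these assumptions, the jump from 1MB to 20MB takes the network from roughly 3 transactions per second to roughly 67, a straight 20x scaling of capacity.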

Another advantage that ties into this is that transactions can be confirmed more quickly. The fuller blocks become, the lower the probability that any given transaction will be included in the next one. One analysis of block capacity found that at 80% capacity, half of all transactions would take around 18.5 minutes on average to confirm. This is too slow for the vast majority of transactions, and it would also become easier (though still very difficult) to double-spend coins. Furthermore, the more unconfirmed transactions there are, the worse the network performs, because each node has only limited memory in which to store them. If too many piled up, some could be dropped, resulting in an unreliable transaction confirmation system.
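The intuition behind those longer waits can be sketched with a toy model (this is not the cited analysis, just an illustration): if each new block independently includes a pending transaction with probability p, the number of blocks you wait follows a geometric distribution, so congestion that halves p doubles the typical wait.

```python
import math

BLOCK_INTERVAL_MIN = 10  # target minutes between blocks

def median_wait(p_inclusion):
    """Median wait in minutes, assuming each new block independently
    includes a pending transaction with probability p_inclusion."""
    # Median number of trials for a geometric distribution:
    # ceil(-1 / log2(1 - p))
    blocks = math.ceil(-1 / math.log2(1 - p_inclusion))
    return blocks * BLOCK_INTERVAL_MIN

print(median_wait(0.9))  # mostly empty blocks: next block, ~10 minutes
print(median_wait(0.3))  # congested blocks: ~20 minutes at the median
```

The model ignores fee prioritization and mempool eviction, but it shows why confirmation times degrade sharply, rather than linearly, as blocks fill up.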

Finally, a larger block size means that more transactions can be included, which in turn means more fees per block. This could become key as the base block reward decreases. More fees would help incentivize miners to keep mining, thus keeping the network secure.
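As a back-of-the-envelope comparison (the 25 BTC subsidy reflects the block reward at the time of writing; the average fee and transaction counts are assumptions):

```python
SUBSIDY_BTC = 25.0    # block reward at the time of writing
AVG_FEE_BTC = 0.0001  # assumed average fee per transaction

def total_fees(txs_per_block):
    """Total fee revenue per block under the assumed average fee."""
    return txs_per_block * AVG_FEE_BTC

print(total_fees(2000))   # ~0.2 BTC/block with full ~1 MB blocks
print(total_fees(40000))  # ~4.0 BTC/block with full ~20 MB blocks
```

Even in this optimistic full-block scenario, fees remain small next to the 25 BTC subsidy, which is exactly why fee revenue matters more as the subsidy halves over time.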

The Cons

Of course, as with anything, there are negatives associated with the proposed 20MB blocks. First, new blocks would take longer to broadcast due to the increased amount of data to be sent. This could result in more orphaned blocks and inconsistencies between different clients.
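A crude single-hop estimate shows the scale of the problem (the 10 Mbit/s bandwidth figure is an assumption; real propagation involves many relay hops and validation time, so actual delays are larger):

```python
def broadcast_seconds(block_size_mb, bandwidth_mbps):
    """Seconds to transmit one block over a single network hop.
    Toy estimate: ignores latency, relaying, and validation."""
    return (block_size_mb * 8) / bandwidth_mbps  # MB -> megabits

print(broadcast_seconds(1, 10))   # 0.8 s per hop for a 1 MB block
print(broadcast_seconds(20, 10))  # 16.0 s per hop for a 20 MB block
```

Every extra second a block spends in transit is a second during which another miner can find a competing block, so a 20x transmission time directly raises the orphan rate.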

Next, the UTXO (Unspent Transaction Output) database would grow more rapidly as a result of the higher transaction cap. The database contains all the transaction outputs that haven’t yet been spent, making it possible to validate new transactions. The size of the database has almost doubled in the last 11 months. Gavin’s blog post addressing this says that with a 1MB cap, the maximum yearly increase in database size would be 50GB, but with 20MB blocks it would be 20 times that. In other words, each year the database could grow by as much as 1000GB, or 1TB. That could be problematic for nodes that don’t have the capacity or funds to continuously expand their storage. If that’s the case, we could expect to see some nodes drop off the network, reducing its decentralization.
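The worst-case arithmetic from Gavin's figures is a simple linear scaling (the 50GB/year baseline is his stated maximum for a 1MB cap; actual growth depends on usage):

```python
MAX_GROWTH_AT_1MB_GB = 50  # Gavin's stated worst-case yearly growth at 1 MB

def max_utxo_growth_gb(block_size_mb):
    """Worst-case yearly UTXO set growth, scaled linearly with block size."""
    return MAX_GROWTH_AT_1MB_GB * block_size_mb

print(max_utxo_growth_gb(20))  # 1000 GB (~1 TB) per year in the worst case
```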

The largest problem, however, would most likely be the hard fork required to change the specification. A hard fork occurs when the blockchain splits into two (or more) distinct chains because different clients disagree about what the protocol should be. In this case, a 1MB chain and a 20MB chain would form. All clients would have to reach consensus on which chain to use, and until then some transactions could end up on different chains depending on the client used. The last hard fork occurred on March 11, 2013, when a block (height 225430) was generated that was incompatible with previous versions of the Bitcoin software. The resolution was to revert and try again with some parameters changed. In the end, clients that hadn’t upgraded to version 0.8.1 by a certain date were effectively pushed off the network, since the majority of the network decided to follow the new rules. Here, something similar would happen: people who didn’t upgrade by the set deadline would be unable to use the main chain.

The Bottom Line

Those are only a few of the notable advantages and disadvantages of the newly proposed 20MB block size. In the end, we will eventually need to increase the block size if Bitcoin is to continue growing. How this is done, however, is up for debate. It could be done all at once at a set time, as Gavin proposed, or gradually over time. Alternatively, as suggested by /u/cedivad on Reddit, the time between blocks could be reduced, which would also allow for higher transaction capacity. Regardless of the final resolution, these debates are important to ensure that Bitcoin can continue to evolve safely.
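Both levers act on the same quantity, raw data capacity per unit of time, which a quick sketch makes concrete (the 30-second interval is just an illustrative choice, not /u/cedivad's exact proposal):

```python
def capacity_mb_per_hour(block_size_mb, interval_minutes):
    """Raw block space per hour; the same capacity can be reached
    with bigger blocks or with more frequent ones."""
    return block_size_mb * (60 / interval_minutes)

print(capacity_mb_per_hour(20, 10))  # 120 MB/h: 20 MB blocks every 10 min
print(capacity_mb_per_hour(1, 0.5))  # 120 MB/h: 1 MB blocks every 30 s
```

The trade-offs differ, though: faster blocks keep individual blocks small and quick to propagate, but produce more orphans simply because blocks are found closer together.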

What is your stance on the new proposed block size? Leave a comment below and join the discussion!