This article is part of the Let’s talk spec series. Please read the introduction if you haven’t.
Let us start with a topic that everyone is familiar with: the maximum block size limit. Everyone has an opinion on this one, so this should be exciting!
A new proposal is being written by a person who goes by the name of im_uname, called Asymmetric Moving Maxblocksize Based On Median Block Size. Quite the title, but the concept is simple.
The basic idea is that we look at recent block history, calculate recent usage, and use it to set a new limit that is much higher. We adjust every time a new block is found. With this, blocks are unlikely to ever go full, even at load peaks.
At the same time we get a sanity check. If someone tries to send us a 1GB block while all recent blocks have been 5MB, we know it is garbage and need not waste resources downloading or validating it.
At the heart of this proposal, as drafted, is this rule:
We propose a dynamic limit to the block size based on the largest number out of the following:
1. Median block size over the last 12,960 blocks (about three months) multiplied by 10 and calculated when a block is connected to the blockchain.
2. Median block size over the last 52,560 blocks (about one year) multiplied by 10 and calculated when a block is connected to the blockchain.
3. 32,000,000 bytes (32MB).
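To make the rule concrete, here is a minimal Python sketch of how a node could recompute the limit each time a block is connected. The `block_sizes` list is a hypothetical stand-in for whatever structure a real node would use to track recent block sizes; this illustrates the rule as quoted above, it is not the reference implementation.

```python
from statistics import median

SHORT_WINDOW = 12_960   # about three months of blocks
LONG_WINDOW = 52_560    # about one year of blocks
FLOOR = 32_000_000      # the 32MB static minimum from rule 3

def next_max_block_size(block_sizes):
    """Recompute the limit each time a block is connected.

    `block_sizes` holds the sizes in bytes of recent blocks, oldest
    first. If the chain is shorter than a window, the slice simply
    uses all available blocks.
    """
    short_median = median(block_sizes[-SHORT_WINDOW:])
    long_median = median(block_sizes[-LONG_WINDOW:])
    return max(10 * short_median, 10 * long_median, FLOOR)
```

Taking the larger of the two medians is what makes the limit asymmetric: it can grow as quickly as the three-month median does, but after a usage peak it can only shrink as slowly as the one-year median allows, and never below the 32MB floor.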
A similar proposal was written, and implemented, by BitPay in the midst of the BTC scaling debate. Of course it had no chance back then, given that both a block size increase and protocol upgrades by hard fork in general were off the table in that camp. But now may be a good time to revisit this idea.
So what are the attack scenarios?
A malicious miner tries to increase the limit by throwing a lot of junk into his blocks. It theoretically costs him nothing, because his transaction fee cost is zero. To have any effect on the limit, he has to mine lots of these junk blocks. In practice, such a block will propagate slowly and he may lose blocks in an orphan race. Propagation techniques like graphene, xthin and compact blocks will not help here, as these require other nodes to have seen the transactions before the block is mined.
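To see how hard it is for a single miner to move the median, consider a toy scenario (the 1MB honest block size and 5% hashrate are my own illustrative assumptions, not numbers from the proposal): the attacker stuffs every one of his blocks to 32MB while the rest of the network produces 1MB blocks.

```python
from statistics import median

WINDOW = 12_960  # the three-month window from the proposal

# 95% of blocks are ~1MB, the attacker's 5% are stuffed to 32MB.
honest = [1_000_000] * int(WINDOW * 0.95)
stuffed = [32_000_000] * int(WINDOW * 0.05)

m = median(honest + stuffed)
print(m)                         # 1000000.0 -- the median did not move
print(max(10 * m, 32_000_000))   # 32000000 -- the limit stays at the floor
```

In this uniform toy scenario the median only moves once junk blocks make up more than half of the window, so the attacker would effectively need majority hashrate, sustained for months, before the limit budged.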
There is still a limit! An evil actor is still able to fill the blocks and temporarily disrupt the network. The disruption will take the form of other people's transactions not confirming in a timely manner. This will be very costly for the actor in terms of fees, and increasingly costly as the network adapts to a higher limit. It is not a sustainable attack.
The limit may become too high. This can be mitigated by miners choosing to mine smaller blocks than the actual hard limit.
I will weigh in with my opinion on this one. I will support any decent dynamic block size proposal, and this is a decent proposal. It is better than what we have: a static, but human-configurable, limit.
There is the human side of it too. The best way to agree on a limit is not to have to agree. Let the system adjust. Keep the humans out of it. We can focus on other fun things.
This change does not need to be permanent. Removing it is a simple soft fork, where we set the static limit back to 32MB.
Finally, for the finer details, here is the specification. The earlier draft by BitPay is also a good read, and it has a FAQ covering, among other things, miner incentives.
Does this break the economics of Bitcoin? Is it preferable that we manually adjust this limit? Should we aim for this change for May?
 

Comments
This is what I was thinking and proposing a while back. This doesn't change the incentives and doesn't break anything... it's a good idea. Automating it would remove the human factor completely, which is a good thing, as only humans can be corrupt(ed), and it seems we are never in shortage of such people.
It's an OK proposal; it could be better if the following unnecessary parts were removed:
  1. Aside from the largest of the above, the limit shall also have a static upper bound at 10,000,000,000,000 bytes (10TB).
  2. Median block size over the last 52,560 blocks (about one year) multiplied by 10 and calculated when a block is connected to the blockchain.

(Also, 3 months seems excessive; 2 weeks worked fine for difficulty.)

I question the benefit of these large timespans. This would entail calculating the median size of 52,560 headers on each block verification, as you can't "roll" medians. This isn't slow, but frankly it feels like pointlessly wasted cycles and cache lines for little gain.
Wouldn't this work just as well if you just took the last two weeks?
My opinion is that the limit should never be allowed to be reduced, only increased. Allowing it to be reduced is to accept the possibility of negative adoption, which makes no sense to me. I also dislike arbitrary things like the 32MB minimum (and the 10TB upper limit other people have mentioned).
Hence, my suggestion is to change #3 above from
"3. 32,000,000 bytes (32MB)."
to
"3. The previous (current) limit."
> Malicious miner tries to increase the limit by throwing a lot of junk into the block

This is very easily detectable, and by doing so the miner increases his orphan rate. If the miner has 5% of the network and fills all his blocks up to 32MB, that would not increase the limit THAT much. I think it's a good and safe proposal.