Comments
This is what I was thinking and proposing a while back.
This doesn't change any incentives and doesn't break anything... it's a good idea. Automating it would remove the human factor from this completely, which is a good thing, as humans are the only entities that can be corrupt(ed), and it seems we are never in shortage of such people.
   4mo ago
It's an OK proposal; it could be better if the following unnecessary parts were removed:
  1. Aside from the largest of the above, the limit shall also have a static upper bound at 10,000,000,000,000 bytes (10TB).
  2. Median block size over the last 52,560 blocks (about one year) multiplied by 10 and calculated when a block is connected to the blockchain.

(Also, 3 months seems excessive. 2 weeks worked fine for difficulty.)
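For reference, the rule being debated can be sketched from the fragments quoted in this thread. This is my reconstruction, not the proposal's actual code: the limit is the largest of several terms (only two of which are quoted above), capped by the static 10 TB bound that this commenter wants removed.

```python
# Reconstruction from the fragments quoted in this thread (assumptions mine):
# new limit = max of the adaptive term and the 32 MB floor, capped at 10 TB.
STATIC_CAP = 10_000_000_000_000  # 10 TB upper bound (part 1 quoted above)
FLOOR = 32_000_000               # 32 MB minimum

def proposal_limit(median_1y: int) -> int:
    """median_1y: median block size over the last 52,560 blocks (~1 year)."""
    adaptive = median_1y * 10    # part 2 quoted above
    return min(max(adaptive, FLOOR), STATIC_CAP)
```

With today's small blocks the 32 MB floor dominates, which is why the cap and the long median window feel redundant to this commenter.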

   4mo ago
I question the benefit of these large timespans. This would entail calculating the median size over 52,560 headers on every block verification, since you can't "roll" a median the way you can a sum. This isn't slow, but it frankly feels like pointlessly wasted cycles and cache lines for little gain.
Wouldn't this work just as well if you just took the last two weeks?
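The "can't roll medians" point can be made concrete with a minimal sketch (illustrative, not node code): a windowed sum updates in O(1) per block, but a windowed median needs the whole window available, at best O(log N) with an order-statistics structure, or O(N log N) if you simply re-sort.

```python
# Sketch: why a windowed median is costlier to maintain than a windowed sum.
from collections import deque

WINDOW = 52_560  # blocks (~1 year), per the proposal under discussion

def rolled_sum(prev_sum, sizes, new_size):
    """O(1) update: add the newest block size, drop the oldest if full."""
    oldest = sizes[0] if len(sizes) == WINDOW else 0
    return prev_sum + new_size - oldest

def window_median(sizes):
    """No O(1) shortcut: the median is recomputed from the whole window."""
    ordered = sorted(sizes)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# usage
sizes = deque(maxlen=WINDOW)
for s in [100, 300, 200, 500, 400]:
    sizes.append(s)
print(window_median(sizes))  # 300
```

A shorter window (two weeks, ~2,016 blocks) shrinks the constant factor without changing the asymptotics, which is the trade-off this comment is raising.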
   4mo ago
My opinion is that the limit should never be allowed to be reduced, only increased. Allowing it to be reduced is to accept the possibility of negative adoption, which makes no sense to me. I also dislike arbitrary things like the 32MB minimum (and the 10TB upper limit other people have mentioned).
Hence, my suggestion is to change #3 above from
"3. 32,000,000 bytes (32MB)."
to
"3. The previous (current) limit."
   4mo ago
> Malicious miner tries to increase the limit by throwing a lot of junk into the block

This is very easily detectable, and by doing so the miner increases his orphan rate. If a miner has 5% of the network and fills all of his blocks to 32 MB, that would not increase the limit THAT much. I think it's a good and safe proposal.
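A back-of-the-envelope check with illustrative numbers (not from the thread) shows why: a median is robust to a 5% minority, so stuffed blocks from a small miner barely move it.

```python
# Illustrative numbers (mine): a miner with 5% of blocks stuffing every one
# of them to 32 MB barely moves the median, since 95% of blocks stay small.
import statistics

NORMAL, JUNK = 200_000, 32_000_000       # assumed typical vs. stuffed sizes
blocks = [NORMAL] * 95 + [JUNK] * 5      # 5% of recent blocks are stuffed
print(statistics.median(blocks))         # 200000.0 -- still the normal size
```

The attacker would need close to half the blocks in the window before the median, and hence the limit, responds to the junk.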
   4mo ago