By now, we have seen quite a few proposals for the block size increase. It's hard not to notice that there are potentially infinitely many functions for future block size increases. One could, for instance, double every N years for any rational number N (and there are infinitely many of those), one could increase linearly, one could double initially then increase linearly, one could ask the miners to vote on the size, one could couple the block size increase to halvings, etc. Without judging any of the proposals currently on the table, one can see that there are countless alternative proposals one could imagine creating. And the creative Bitcoin community is capable of generating all of them, with two new ones added into the mix within the preceding 24 hours.
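To make the breadth of that design space concrete, here is a minimal sketch, in Python, of three of the schedule shapes mentioned above. The starting size, activation year, doubling period, and growth rates are placeholders chosen purely for illustration; they are not taken from any actual BIP or proposal.

```python
GENESIS_SIZE_MB = 1.0   # illustrative starting block size limit (MB), not an actual proposal
START_YEAR = 2016       # illustrative activation year shared by the sketches below

def doubling_schedule(year, period_years=2.0):
    """Double the limit every `period_years` years (the period is an arbitrary choice)."""
    return GENESIS_SIZE_MB * 2 ** (max(0, year - START_YEAR) / period_years)

def linear_schedule(year, mb_per_year=1.0):
    """Grow the limit by a fixed number of megabytes per year."""
    return GENESIS_SIZE_MB + mb_per_year * max(0, year - START_YEAR)

def double_then_linear(year, switch_year=2020, mb_per_year=2.0):
    """Double until `switch_year`, then continue growing linearly from there."""
    if year <= switch_year:
        return doubling_schedule(year)
    return doubling_schedule(switch_year) + mb_per_year * (year - switch_year)

if __name__ == "__main__":
    # Compare the three curves at a few sample years.
    for y in (2016, 2020, 2024, 2030):
        print(y, [round(f(y), 1) for f in (doubling_schedule, linear_schedule, double_then_linear)])
```

Each of these is a perfectly plausible-looking curve, and nothing in the functions themselves tells us which one is right; that judgment has to come from somewhere else.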
It seems like it's time to nudge the community towards thinking one notch higher, about meta-goals. That is, can we enunciate what grand goals a truly perfect function would achieve? If we could look into the future and know all the improvements to come in network access technologies, see the expansion of the Bitcoin network across the globe, and precisely know the placement and provisioning of all future nodes, what metrics would we care about as we craft a function to fit what is to come?
To be clear, I want to avoid discussing any specific block size increase proposal. That's very much the tangible (non-meta) block size debate, and everyone has their opinion and their best good-faith attempt at what that function should look like. I've purposefully stayed out of that issue, because there are too many options and no metrics for evaluating them.
Instead, I want to nudge the community into thinking about how to evaluate a good proposal. If we were looking at the best possible function, the perfect BIP for Bitcoin in the long term, how would we know it when we saw it? What characteristics would that BIP have? If we have N BIPs to choose from, what criteria do we use to choose among them?
Ideally, we would achieve rough consensus on quantitative, explicit meta-criteria before we get into debating the specifics of potentially infinitely many proposals. To illustrate, some possible meta-goals might be: a large fraction of today's full nodes (say, 90%) should be able to continue validating and relaying blocks under the new limits; block propagation delays should not grow to the point where large miners gain a significant advantage over small ones; and the cost of operating a full node should remain within reach of ordinary users rather than only data centers.
And it's quite OK, and probably likely, to have a combination of these kinds of metrics and constraints.
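As a sketch of what "quantitative and explicit" could look like in practice, the following hypothetical check pits a candidate schedule against one such meta-criterion: that at least 90% of a (synthetic) node population can still download blocks quickly enough, under an assumed rate of bandwidth growth. The threshold, the bandwidth distribution, the growth rate, and the 30-second download budget are all assumptions made up for illustration, not measurements or proposals.

```python
import random

# Illustrative assumptions, not actual proposals or measurements:
MIN_FRACTION_KEEPING_UP = 0.90            # meta-criterion: 90% of nodes must keep up
ASSUMED_BANDWIDTH_GROWTH_PER_YEAR = 1.20  # node bandwidth assumed to grow 20%/year

def doubling_schedule(year, start_year=2016, start_mb=1.0, period_years=2.0):
    """Same illustrative schedule as in the earlier sketch: double every two years."""
    return start_mb * 2 ** (max(0, year - start_year) / period_years)

def fraction_keeping_up(block_size_mb, node_bandwidths_mbps, max_download_s=30):
    """Fraction of nodes that can download a block of the given size within max_download_s."""
    ok = sum(1 for bw in node_bandwidths_mbps if (block_size_mb * 8) / bw <= max_download_s)
    return ok / len(node_bandwidths_mbps)

def evaluate(schedule, years, node_bandwidths_mbps):
    """Test a candidate schedule against the illustrative meta-criterion, year by year."""
    rows = []
    for i, year in enumerate(years):
        size = schedule(year)
        # Scale every node's bandwidth by the assumed yearly growth factor.
        grown = [bw * ASSUMED_BANDWIDTH_GROWTH_PER_YEAR ** i for bw in node_bandwidths_mbps]
        frac = fraction_keeping_up(size, grown)
        rows.append((year, size, frac, frac >= MIN_FRACTION_KEEPING_UP))
    return rows

if __name__ == "__main__":
    random.seed(1)
    # Synthetic node population: per-node bandwidth in Mbps with a heavy-tailed spread.
    nodes = [random.lognormvariate(2.5, 1.0) for _ in range(5000)]
    for year, size, frac, ok in evaluate(doubling_schedule, range(2016, 2031, 2), nodes):
        print(f"{year}: {size:6.1f} MB  {frac:.0%} of nodes keep up  pass={ok}")
```

The point is not these particular numbers. It is that once the criteria are written down this explicitly, any proposed schedule can be checked against them under different assumptions about how the network will evolve.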
Of course, even with meta-goals in hand, there will be plenty of room for disagreement, because we do not actually know the future and reasonable people can disagree on how things will evolve. Still, I think the shift is worthwhile: it should be easier to agree on meta-goals than on an actual, specific function for increasing the block size.
For full disclosure, I personally do not have a horse in the block size debate, besides wanting to see Bitcoin evolve and get more widely adopted. I ask because I want to understand what meta-criteria we should use to evaluate the many different proposals -- ideally, the decision would be made from first principles, as in a scientific discipline, rather than through cacophonous discussion driven by personalities, as in the UN or Congress. Also, as an academic who has built a large emulation framework, I'd like to understand whether we can use simulation and analytic techniques to examine the proposals.
So, at this point, specific meta-level evaluation criteria would help more than new proposals exploring yet more variants of block size increase schedules. I'd like to encourage the community to think about making these meta-criteria quantitative and explicit.