OPEN LETTER TO BITCOIN MINERS AND CORE DEVELOPERS
@squirrellywrath
https://www.cryptocoinsnews.com/wp-content/uploads/2014/05/bitcoin-mining.jpg

First, lemme say kudos to everyone in the Bitcoin community, from miners to client-facing exchanges to end-users and sound money advocates. Bitcoin represents freedom from government oppression. Given that last sentence, one would correctly assume that I fall on the purist ideological side. That said, and without being an engineer or a coder, I wish to chime in on the block size debate. As those following it know, the block size debate centers on the trade-off between the protections of decentralization and the need for scalability. It is a good problem to have, in the sense that Bitcoin has become so popular that the network is jamming up.

http://www.bitcoinisle.com/wp-content/uploads/2016/05/725_aHR0cDovL2NvaW50ZWxlZ3JhcGguY29tL3N0b3JhZ2UvdXBsb2Fkcy92aWV3L2JlMTc5NjIyMmE3ZTU1ODMxNzAzMjgyMzgzNTllMjlhLmpwZWc.jpeg

I have never been a fan of Gavin Andresen (he seems so corporate to me). His proposal for an 8MB block size (versus the current 1MB) seemed rather arbitrary and extreme. Actually, I think he first proposed a 20MB block size. Really, wouldn't it make more sense to scale by 2x, 4x, 8x, or 16x, etc.? Whatever. Then there was the whole episode of his being "convinced" by Craig Wright's claim of being Satoshi Nakamoto. In a word: WOW, you screwed up, bro.

http://www.inchcalculator.com/wp-content/uploads/2015/03/concrete-block-dimensions.png

All that said, would doubling the block size to 2MB be such a compromise of principle? Certainly nowhere near the compromise of principle those chaps at Ethereum attempted with a hard-fork rollback. As I understand it, the block size cannot be increased forever, because of miner centralization (barriers to entry grow with block size). But there seems to be quite a bit of room.

http://www.carlsterner.com/research/images/2009_resilience/baran1964.jpg

If we can accept some centralization, so long as it is strictly bounded and centralization concerns are addressed, Bitcoin could have plenty of room to run. Although I am not a coder, I have coded a little as a hobby. I understand the inelegance of hard-forking, and worse still, hard-forking just to change a *constant* value. My simple idea: replace the constant block size value with a variable. Is that nuts? It seems so obvious to me. But what variable to use, such that the appropriate value (e.g., 1MB, 2MB, etc.) gets plugged in?

http://www.jasonsummers.org/wp-content/uploads/2013/01/Federal-Tax-Revenues-vs.-Government-Spending.png

I cannot help but think of a solution to the U.S. federal budget (a disaster, to say the least). Libertarians, minarchists, anarchists, etc., of course, believe the budget should be wound down to zero. I agree. But the futility of convincing the public of such a goal, let alone of how to reach it, makes that a non-starter. A *reasonable* solution would be to base the current year's budget on the previous year's tax revenue. In other words, however many trillions were collected in a given year would be the sole input for determining the budget in the following year -- the result being a roughly balanced budget and an end to unsustainable debt.

https://blog.coinbase.com/content/images/2015/12/Bitcoin-transactions-on-the-blockchain.jpg

Could we not implement a similar solution for determining block size? Rather than hard-code the block size as a constant, developers could write a simple formula that uses, say, the most recent six months of transaction volume as its sole input. When transaction volume increases (as the network grows), the block size would *automatically* double at the appropriate time. Simple, elegant, and unbiased, no?
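To make the idea concrete, here is a minimal sketch, written in Python rather than Bitcoin Core's actual C++, of the kind of formula I have in mind. Everything in it -- the `block_size_limit` name, the 80% utilization threshold, the six-month window, the cap on doublings -- is my own illustrative assumption, not anything the developers have proposed.

```python
# Back-of-the-envelope sketch (not Bitcoin Core code): derive the block size
# limit from recent on-chain demand instead of a hard-coded constant.
# All names and numbers here are illustrative assumptions, not a spec.

BASE_BLOCK_SIZE = 1_000_000           # 1 MB, today's limit
MAX_DOUBLINGS = 4                     # hypothetical cap, i.e. never above 16 MB
BLOCKS_PER_SIX_MONTHS = 6 * 30 * 144  # roughly 144 blocks per day

def block_size_limit(recent_block_sizes):
    """Given the sizes (in bytes) of roughly the last six months of blocks,
    return the limit for the next period: start at 1 MB and double whenever
    sustained usage fills more than 80% of the current limit."""
    if not recent_block_sizes:
        return BASE_BLOCK_SIZE

    avg_usage = sum(recent_block_sizes) / len(recent_block_sizes)

    limit = BASE_BLOCK_SIZE
    doublings = 0
    # Ratchet the limit upward while observed demand would congest it.
    while doublings < MAX_DOUBLINGS and avg_usage > 0.8 * limit:
        limit *= 2
        doublings += 1
    return limit

# Example: blocks averaging ~950 KB over the last six months would bump the
# limit from 1 MB to 2 MB for the next period.
if __name__ == "__main__":
    print(block_size_limit([950_000] * BLOCKS_PER_SIX_MONTHS))  # -> 2000000
```

The point of the sketch is the design, not the particular numbers: the rule is mechanical and backward-looking, just like basing this year's budget on last year's tax revenue, so nobody has to lobby for or guess at the "right" constant.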
I hope the miners have a good meeting. It seems the whole community is stressing out over ETH/ETC and the latest exchange hack. Let's get it together, finally!