Bitcoin’s most influential developer has proposed a controversial fix that would help it handle more transactions.
In a test of Bitcoin’s ability to adapt to its own growing popularity, the Bitcoin community faces a dilemma: how to change Bitcoin’s core software so that the rising volume of transactions doesn’t swamp the network. Some fear that the network, as currently designed, could be overwhelmed as early as next year.
The answer will help determine the form Bitcoin’s network takes as it matures. But the loose-knit community of Bitcoin users does not agree on how to proceed, and the nature of Bitcoin, a technology neither owned nor controlled by any one person or entity, could make the impending decision-making process challenging. At the very least, the disagreement is a cloud of uncertainty hanging over Bitcoin’s long-term future.
The technical problem, which most agree is solvable, is that Bitcoin’s network now has a fixed capacity for transactions. Before he or she disappeared, Bitcoin’s mysterious creator, Satoshi Nakamoto, limited the size of a “block,” or group of transactions, to one megabyte. The technology underlying Bitcoin works because a network of thousands of computers contribute the computational power needed to confirm every transaction and record them all in a permanent, publicly accessible ledger called the blockchain (see “What Bitcoin Is and Why It Matters”). Every 10 minutes, an operator of one of those computers wins the chance to add a new block to the chain and receives freshly minted bitcoins as a reward. That process is called mining.
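To make the mining step concrete, here is a minimal Python sketch of the brute-force search it involves: repeatedly hashing a candidate block with different nonces until the result falls below a difficulty target. This is a toy, not Bitcoin’s actual code; real mining applies double SHA-256 to a specific 80-byte block header against a network-set target, and the difficulty here is chosen just so the example finishes quickly.

```python
import hashlib

def mine(block_header: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: try nonces until SHA-256(header + nonce)
    falls below the target implied by difficulty_bits."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            block_header + nonce.to_bytes(8, "little")
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# A low difficulty so the toy finishes in a fraction of a second;
# Bitcoin's real difficulty makes the same search take the whole
# network about 10 minutes per block.
print(mine(b"example block header", difficulty_bits=16))
```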
Under the one-megabyte-per-block limit, the network can process only about three transactions per second. If Bitcoin becomes a mainstream payment system, or even a platform for all kinds of other online business besides payments (see “Why Bitcoin Could Be Much More Than a Currency”), it’s going to have to process a lot more. Visa, by comparison, says its network can process more than 24,000 transactions per second.
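The roughly-three-per-second figure follows from simple arithmetic, sketched below. The average transaction size used here is an assumption chosen to reproduce that figure; real transactions vary widely in size.

```python
# Back-of-the-envelope throughput under the one-megabyte cap.
# AVG_TX_BYTES is an assumption, not a measured value.
MAX_BLOCK_BYTES = 1_000_000      # Nakamoto's block size limit
BLOCK_INTERVAL_SECONDS = 600     # one block roughly every 10 minutes
AVG_TX_BYTES = 500               # assumed average transaction size

tx_per_block = MAX_BLOCK_BYTES // AVG_TX_BYTES           # 2,000
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS    # about 3.3
print(f"~{tx_per_second:.1f} transactions per second")
```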
The developers in charge of maintaining Bitcoin’s core software have been aware of this impending problem for a while. Gavin Andresen, who has led work on Bitcoin’s core code since Nakamoto handed him the reins in 2010, told MIT Technology Review last August that his favored solution to the problem is to increase the maximum block size (see “The Man Who Really Built Bitcoin”). Earlier this month, Andresen got more specific, proposing that the maximum block size be increased to 20 megabytes starting in March 2016, calling this the “simplest possible set of changes that will work.” In a subsequent post on his blog, Andresen called the need for the change “urgent,” noting that the network would likely become unreliable if it were allowed to reach the current limit.
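Part of what makes the change simple in code yet hard in practice is that the limit is a consensus rule: every node independently rejects oversized blocks, so a bigger number only works if the whole network upgrades together. The sketch below illustrates the idea; it is not Bitcoin Core’s actual implementation.

```python
# Illustrative only: the cap is a validity rule every node enforces
# on its own, so a block that one node accepts and another rejects
# would split the network. That is why raising the constant requires
# a coordinated upgrade rather than a quiet code tweak.
MAX_BLOCK_SIZE = 1_000_000   # 20_000_000 under Andresen's proposal

def block_size_ok(serialized_block: bytes) -> bool:
    """Reject any block whose serialized size exceeds the limit."""
    return len(serialized_block) <= MAX_BLOCK_SIZE
```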
Mike Hearn, a former Google software engineer who has contributed to Bitcoin’s development, has calculated that at the current rate of transaction growth, the limit will be hit sometime in 2016. “Because upgrades take time, we need to prepare for this now,” Hearn writes in his own recent post on the issue.
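An estimate of this kind can be reproduced with a simple exponential-growth projection, as in the sketch below; the starting block size and growth rate are illustrative assumptions, not Hearn’s actual inputs.

```python
import math

# If average block size grows exponentially, solve for when it
# reaches the cap. Both inputs below are assumed for illustration.
avg_block_bytes_now = 400_000    # assumed average block size today
annual_growth_rate = 1.0         # assumed doubling every year
cap_bytes = 1_000_000

years_to_cap = math.log(cap_bytes / avg_block_bytes_now) / math.log(
    1 + annual_growth_rate
)
print(f"cap reached in about {years_to_cap:.1f} years")   # ~1.3
```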
The problem is that consensus is required to make a change as consequential as the one Andresen suggests, which would substantially alter the requirements for mining. And not everyone in the community of users of the Bitcoin software—which includes miners, developers, and a growing number of startups—agrees that Andresen’s proposal is the best path forward.
A popular argument against the change is that it would favor bigger, richer mining operations that could afford the increased costs of processing and storing bigger blocks. That could lead to a dangerous “centralization” within the mining community, says Arvind Narayanan, a professor of computer science at Princeton University (see “Rise of Powerful Mining Pools Forces Rethink of Bitcoin’s Design”). Another, more ideological argument is that Bitcoin was never supposed to change this drastically from Nakamoto’s original design. Some even argue that the limit doesn’t need to increase at all, as long as the developers make smaller adjustments to keep the network from buckling when it reaches the limit—though that could make it more expensive to get transactions confirmed without delays.
The growing commercial ecosystem around Bitcoin is at stake. If the limit remains fixed, businesses hoping to store lots of transactions on the blockchain could be out of luck. And such interest is only increasing—earlier this month, the Nasdaq stock exchange said it was testing Bitcoin’s blockchain for transactions in its private market subsidiary. If the test is successful, the exchange says, it could use the technology for all Nasdaq trades in the public market.
Will Bitcoin be able to handle that? Pieter Wuille, another of Bitcoin’s five core developers, says right now there are just too many unknowns about the consequences of increasing the block size to 20 megabytes. In addition to significantly raising the cost of validating transactions, which could force out smaller players, he says, there may be “things we don’t even know of that could break.” Wuille is in favor of increasing the block size “in general,” but says a smaller increase at first would be less risky.
For now, the debate will continue to play out on the Bitcoin development mailing list, a forum that includes the core developers as well as the many others who contribute code.
Ultimately, though, the decision-making process “really comes down to how the core developers feel about it,” says Narayanan, since they are the only ones with the power to change the code. Complicating things even further is the fact that it’s not exactly clear how they would solicit input from all the stakeholders, many of whom may prefer to remain anonymous. The core developers could eventually find it necessary to take matters into their own hands.
At least one of them thinks that would be a bad idea: it would set an “incredibly dangerous precedent,” says Wuille.