A reader sends a link to an article on Ars Technica further documenting the energy costs of Bitcoin:
The bitcoin network is run by miners, computers that maintain the shared transaction ledger called the blockchain. A new study estimates that this process consumes at least 2.6GW of power—almost as much electric power as Ireland consumes. This figure could rise to 7.7GW before the end of 2018—accounting for almost half a percent of the world’s electricity consumption.
The study is an updated version of calculations performed late last year by analyst Alex de Vries. In this new version, de Vries has gathered more detailed information about the economics of the mining business. But his new numbers are broadly consistent with the old ones. Last December, he estimated that the bitcoin network was consuming roughly 32TWh annually, or 3.65GW. His website, which is updated daily, now shows the network consuming 67TWh annually, just under that upper bound of 7.7GW shown in his new study.
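For reference, the conversion between the annual-energy figures and the average-power figures quoted above is simple arithmetic: average power in GW equals annual energy in TWh, times 1,000 GWh per TWh, divided by the 8,760 hours in a year. A quick sketch:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def twh_per_year_to_gw(twh: float) -> float:
    """Convert annual energy consumption in TWh to average power draw in GW."""
    return twh * 1000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh

print(twh_per_year_to_gw(32))  # ~3.65 GW (the December estimate)
print(twh_per_year_to_gw(67))  # ~7.65 GW, just under the 7.7 GW bound
```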
But even more interesting:
A crucial point here is that the difficulty of the mining task automatically adjusts to maintain a 10 minute average block creation rate. So if more computing power joins the network, the result isn’t that more bitcoins get created. Instead, it takes more computing power to produce each bitcoin, making existing mining hardware less profitable than before—and driving up the energy consumed per bitcoin.
I was not aware of this, and it is a rare sighting of an anti-scalable algorithm, if I don't miss my guess.
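To make the anti-scalable point concrete, here is a minimal sketch of the retargeting rule. It is simplified from the real protocol, which adjusts only once every 2,016 blocks and caps each adjustment at a factor of 4, but the core behavior is the same: if the network's hash rate doubles, the difficulty doubles to restore the 10-minute block time, so the energy spent per bitcoin doubles while the rate of coin issuance stays flat.

```python
# Simplified sketch of Bitcoin's difficulty retargeting.
# (Real Bitcoin retargets every 2,016 blocks and clamps each change
# to at most a factor of 4 in either direction.)

TARGET_BLOCK_TIME = 10 * 60  # target: one block every 10 minutes (seconds)

def retarget(difficulty: float, actual_block_time: float) -> float:
    """Scale difficulty so average block time returns to the 10-minute target."""
    return difficulty * TARGET_BLOCK_TIME / actual_block_time

# Hash rate doubles, so blocks briefly arrive in 5 minutes instead of 10...
difficulty = 1.0
difficulty = retarget(difficulty, actual_block_time=5 * 60)

# ...and after retargeting, twice the hardware (and energy) is required to
# produce the same number of bitcoins per hour as before.
print(difficulty)  # 2.0
```

Adding capacity to most distributed systems increases throughput; here, by design, it only increases the cost per unit of output.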