With cryptocurrencies getting so much attention, this report in NewScientist (27 January 2018) certainly does nothing to encourage me to buy any bitcoin:
Decentralisation is key to cryptocurrencies, because there is no Federal Reserve or European Central Bank to lend legitimacy to the cause. Instead, decentralised networks authenticate transactions so no individual user has the power to manipulate the process, but everyone has the power to check it.
Emin Gün Sirer at Cornell University in New York and his colleagues monitored the bitcoin and ethereum networks from 2015 to 2017 to see how decentralisation was faring. “There is a lot of noise made about decentralisation, and then when you look at it, it’s not all that decentralised,” Sirer says. On top of this, bitcoin’s value has halved since last month, and other cryptocurrencies have seen similar declines.
With bitcoin, the top four miners control more than half of the network’s computational power, known as the “hash share”. With ethereum, a well-established cryptocurrency that uses smart online contracts, more than 60 per cent of the computational power is controlled by only three miners. These may be individual miners or groups of people who pool their processing power.
This is dangerous, because any person or group with a hash share of 51 per cent or more could potentially game the system by either censoring other users’ bitcoin transactions – making sure that they can’t send or receive currency – or by double-spending their own coins, according to Garrick Hileman at the University of Cambridge.
Or, as they point out, welcome back to central banking – with none of the governmental oversight.
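To make the hash-share point concrete, here is a minimal sketch in Python of how few miners would need to collude to cross the 51 per cent line. The pool names and percentages are invented for illustration; they are not the figures from the Cornell study.

```python
# Hypothetical hash shares of mining pools (made-up numbers summing to 1.0).
hypothetical_shares = {
    "pool_a": 0.25, "pool_b": 0.18, "pool_c": 0.12, "pool_d": 0.10,
    "pool_e": 0.09, "pool_f": 0.08, "pool_g": 0.07, "pool_h": 0.06,
    "pool_i": 0.05,
}

def smallest_majority_coalition(shares, threshold=0.51):
    """Return the fewest miners (largest first) whose combined hash share
    reaches the threshold, along with that combined share."""
    coalition, total = [], 0.0
    for name, share in sorted(shares.items(), key=lambda kv: kv[1], reverse=True):
        coalition.append(name)
        total += share
        if total >= threshold:
            break
    return coalition, total

members, combined = smallest_majority_coalition(hypothetical_shares)
print(f"{len(members)} miners ({', '.join(members)}) control {combined:.0%} of the hash power")
# -> 3 miners (pool_a, pool_b, pool_c) control 55% of the hash power
```

With shares distributed like this, three pools acting together would be enough to censor transactions or double-spend, which is exactly the concentration the researchers warn about.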
Does this suggest that we cannot replace that function of government, tainted as it is by the potential for human corruption, with the clean objectivity of algorithms? Well, honestly, I have no idea – one data point doesn’t make an argument.
Except there is a second data point: certain AI applications have been observed behaving in sexist or racist ways in situations where humans might be sexist or racist as well. This is put down to the data used to train the artificial intelligence.
As ever, our constructs are vulnerable to corruption because they are our inventions – which means we need to monitor them, just as we monitor ourselves.