Someone’s measuring the cost of training a computer via machine learning, as reported by New Scientist (15 June 2019):
Training artificial intelligence is an energy intensive process. New estimates suggest that the carbon footprint of training a single AI is as much as 284 tonnes of carbon dioxide equivalent – five times the lifetime emissions of an average car.
Emma Strubell at the University of Massachusetts Amherst in the US and colleagues have assessed the energy consumption required to train four large neural networks, a type of AI used for processing language.
Language-processing AIs underpin the algorithms that power Google Translate as well as OpenAI’s GPT-2 text generator, which can convincingly pen fake news articles when given a few lines of text.
These AIs are trained via deep learning, which involves processing vast amounts of data. “In order to learn something as complex as language, the models have to be large,” says Strubell.
Well, that’s lovely to know, but why pick the car as your comparison? The point of comparisons is to compare like things, which is to say things that belong to the same functional category.
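For what it’s worth, the quoted numbers do pin down what the car comparison actually asserts. A quick back-of-envelope sketch (the per-car figure is derived from the quote, not stated in it):

```python
# Back-of-envelope unpacking of the quoted comparison. The 284 tonnes and
# the factor of five come from the article; the per-car figure is derived.
training_footprint_t = 284   # tonnes CO2e to train one large model
car_multiple = 5             # "five times the lifetime emissions of an average car"

car_lifetime_t = training_footprint_t / car_multiple
print(f"Implied lifetime emissions per car: {car_lifetime_t:.0f} tonnes CO2e")
# -> about 57 tonnes CO2e, presumably counting fuel over the car's life
```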
So compare to a human being. How much does it cost to bring a human from birth to the same level of capability as the computer, discounted for the fact that the human is, in most cases, multi-capable, unlike the computer? Another factor is the ease of replicating that ability from computer to computer, without repeating the learning, while each human lacks that all-important USB port in their head.
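That replication point is worth dwelling on: once a network is trained, its learned parameters are just a file, and copying a file costs essentially nothing compared with the training run. A minimal sketch of the idea, assuming PyTorch (the model and file name here are illustrative, not anything from the article):

```python
# Minimal sketch of the "replication is cheap" point, assuming PyTorch.
import torch
import torch.nn as nn

# Stand-in for an expensively trained network.
trained = nn.Linear(512, 512)

# The costly part (training) happens once; saving the result is trivial.
torch.save(trained.state_dict(), "weights.pt")

# Any number of other machines can load the same weights for roughly
# the cost of a file copy: no retraining, no additional 284 tonnes.
replica = nn.Linear(512, 512)
replica.load_state_dict(torch.load("weights.pt"))
```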
I have no idea how to answer the question. I just keep in mind that using computers to solve problems well within the manual capability of humans would seem to be a waste of energy and a needless cost to the climate.
And, yet, who laments the disappearance of the great thundering herds of filing clerks? The entire question of which class of problems deserves the application of computers is a little more nuanced than one might think.