Worse still, training neural networks now often involves an advanced technique called neural architecture search (NAS), which automates the design of a network essentially by trial and error: many candidate architectures are trained and evaluated, and the best one is kept. The process is extremely compute-intensive: the same Transformer initially takes 84 hours to train on a new language task, but with NAS the figure climbs to some 270,000 hours.
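To make the trial-and-error idea concrete, here is a minimal sketch of NAS as plain random search. Everything in it is a hypothetical stand-in: the search space, the `evaluate` function, and the trial count are invented for illustration. In a real NAS run, each call to `evaluate` would mean fully training a candidate network, which is exactly what multiplies the compute cost so dramatically.

```python
import random

# Hypothetical search space of architectural choices (illustrative only).
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "hidden_size": [128, 256, 512],
    "num_heads": [4, 8, 16],
}

def sample_architecture(rng):
    """Pick one value per architectural choice at random."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training and validating a candidate network.
    A real NAS run would train the model here, at enormous cost;
    this fake score just rewards bigger architectures."""
    return 0.5 + 0.01 * arch["num_layers"] + 0.0001 * arch["hidden_size"]

def random_search(num_trials, seed=0):
    """Trial-and-error search: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search(num_trials=20)
print(best, score)
```

Even this toy loop shows the scaling problem: twenty trials means twenty full training runs, and production NAS systems evaluate thousands of candidates.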
And that is only the tip of the iceberg: these calculations were performed for specific, well-known neural networks. How much energy the far larger cloud platforms of Google and Amazon consume, and what energy capacities they rely on, remains an open question. But what is already known raises serious concerns: the development of AI should not become a new threat to the ecology of our planet.