artificial intelligence, climate change
A paper estimates that training Google's 213M-parameter Transformer language model produced 626,155(!) pounds of carbon dioxide: https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/
psst, markov chains are the solarpunk GPT-2, pass it on
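(for the uninitiated: a Markov chain just records which words follow which, then wanders that map to spit out new text, no GPU required. A minimal sketch, with a made-up corpus for illustration:)

```python
import random

def train(text):
    # Map each word to every word that follows it in the corpus.
    chain = {}
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=10, seed=0):
    # Walk the chain, picking a random successor at each step.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the sun powers the town and the town loves the sun"
chain = train(corpus)
print(generate(chain, "the"))
```

Trains in milliseconds on a laptop. Take that, 626,155 pounds of CO2.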
Hometown is adapted from Mastodon, a decentralized social network with no ads, no corporate surveillance, and ethical design.