artificial intelligence, climate change 

a paper estimates that training Google's 213M-parameter Transformer language model (with neural architecture search) produced 626,155(!) pounds of carbon dioxide technologyreview.com/s/613630/


psst, markov chains are the solarpunk GPT-2, pass it on
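for the curious: the kind of word-level Markov chain the toot is winking at fits in a few lines and runs on basically no electricity. a minimal sketch (names like `build_chain` and `generate` are made up for illustration, not from any particular library):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each run of `order` words to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    """Walk the chain, picking a random successor at each step."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    while len(out) < length:
        successors = chain.get(tuple(out[-len(key):]))
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the sun powers the town and the town powers the sun"
chain = build_chain(corpus)
print(generate(chain, length=8))
```

training cost: one pass over the corpus. no GPUs were harmed.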

Friend Camp
