MacDirectory Magazine

Dmitry Marin

MacDirectory magazine is the premier creative lifestyle magazine for Apple enthusiasts, featuring interviews, in-depth tech reviews, Apple news, insights, the latest Apple patents, apps, market analysis, entertainment and more.

Issue link: https://digital.macdirectory.com/i/1500862

Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins

Kate Saenko, Associate Professor of Computer Science, Boston University

Special thanks to The Conversation for republishing permission

Generative AI is the hot new technology behind chatbots and image generators. But how hot is it making the planet? As an AI researcher, I often worry about the energy costs of building artificial intelligence models. The more powerful the AI, the more energy it takes. What does the emergence of increasingly powerful generative AI models mean for society’s future carbon footprint?

“Generative” refers to the ability of an AI algorithm to produce complex data. The alternative is “discriminative” AI, which chooses between a fixed number of options and produces just a single number. An example of a discriminative output is choosing whether to approve a loan application. Generative AI can create much more complex outputs, such as a sentence, a paragraph, an image or even a short video. It has long been used in applications like smart speakers to generate audio responses, or in autocomplete to suggest a search query. However, it only recently gained the ability to generate humanlike language and realistic photos.

Using more power than ever

The exact energy cost of a single AI model is difficult to estimate; it includes the energy used to manufacture the computing equipment, create the model and use the model in production. In 2019, researchers found that creating a generative AI model called BERT with 110 million parameters consumed the energy of a round-trip transcontinental flight for one person. The number of parameters refers to the size of the model, with larger models generally being more skilled. Researchers estimated that creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt hours of electricity and generated 552 tons of carbon dioxide equivalent, as much as 123 gasoline-powered passenger vehicles driven for one year. And that’s just for getting the model ready to launch, before any consumers start using it.

Size is not the only predictor of carbon emissions. The open-access BLOOM model, developed by the BigScience project in France, is similar in size to GPT-3 but has a much lower carbon footprint, consuming 433 MWh of electricity and emitting 30 tons of CO2eq. A study by Google found that for the same size, using a more efficient model architecture and processor and a greener data center can reduce the carbon footprint by 100 to 1,000 times.
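To make the generative/discriminative distinction above concrete, here is a minimal sketch in Python. Both functions are hypothetical stand-ins invented for this illustration, not real models: the discriminative one collapses its input to a single fixed choice, while the generative one produces new, open-ended output.

```python
import random

# Toy stand-ins for the two kinds of model described above. Both functions
# are hypothetical, invented for this illustration; neither is a real system.

def discriminative_loan_model(income: float, monthly_payment: float) -> bool:
    """Discriminative: chooses between a fixed set of options (approve or deny)."""
    return income > 3 * monthly_payment  # assumed toy decision rule

def generative_text_model(prompt: str, length: int = 6) -> str:
    """Generative: produces new, complex data, here a sequence of words."""
    vocab = ["the", "loan", "was", "approved", "after", "careful", "review"]
    return prompt + " " + " ".join(random.choice(vocab) for _ in range(length))

print(discriminative_loan_model(income=5_000, monthly_payment=1_200))  # True
print(generative_text_model("Dear applicant,"))  # a freshly generated sentence
```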
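And here is a rough back-of-the-envelope check of the figures quoted above, assuming the reported tons and megawatt hours describe the same training runs: dividing emissions by electricity gives the implied carbon intensity of each model's power supply, which is where much of BLOOM's advantage comes from.

```python
# The training figures quoted in the article
gpt3_mwh, gpt3_tons = 1_287, 552    # GPT-3: electricity (MWh), emissions (t CO2eq)
bloom_mwh, bloom_tons = 433, 30     # BLOOM: electricity (MWh), emissions (t CO2eq)
gpt3_vehicle_equiv = 123            # gasoline vehicles driven for one year

# Implied carbon intensity of each training run's electricity (kg CO2eq per MWh)
gpt3_kg_per_mwh = gpt3_tons * 1_000 / gpt3_mwh      # about 429
bloom_kg_per_mwh = bloom_tons * 1_000 / bloom_mwh   # about 69

# Emissions of one vehicle-year implied by the article's own comparison (t CO2eq)
tons_per_vehicle_year = gpt3_tons / gpt3_vehicle_equiv  # about 4.5

print(f"GPT-3 grid intensity: {gpt3_kg_per_mwh:.0f} kg CO2eq/MWh")
print(f"BLOOM grid intensity: {bloom_kg_per_mwh:.0f} kg CO2eq/MWh")
print(f"One vehicle-year:     {tons_per_vehicle_year:.1f} t CO2eq")
```

By these numbers, BLOOM used roughly a third of GPT-3's electricity on a grid roughly six times cleaner per megawatt hour, which together accounts for its footprint being nearly twenty times smaller.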
