
The Four Ms: the plan Google is proposing so AI companies can use up to a thousand times less electricity.

  • The energy AI consumes as it trains and responds is becoming an increasingly alarming problem.

  • Google proposes a four-pronged strategy to solve the problem: model, machine, mechanization, and mapping.

Concerns are growing about the energy demand, water consumption and carbon footprint of artificial intelligence. This is not an isolated problem, but a reality that is putting increasing pressure on power grids and has prompted the International Energy Agency to convene a global summit. Google proposes a four-pronged strategy to address it.

Four M’s. In a study published by IEEE, Google identifies four methods it calls the “4Ms” through which it claims large AI companies can reduce the carbon emissions of their machine learning algorithms by 100 to 1,000 times:

  1. Model: Use more efficient machine learning architectures to reduce compute requirements by 3-10x.
  2. Machine: Use specialized AI hardware to increase efficiency by 2-5x.
  3. Mechanization: Choose cloud computing over local computing to reduce energy consumption by 1.4-2 times.
  4. Mapping: Optimize data center locations based on available clean energy to reduce emissions by 5-10x.
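The four factors compound multiplicatively. As a back-of-the-envelope check (plain Python, using only the ranges quoted in the list above), the optimistic ends multiply out to exactly 1,000x:

```python
# Per-"M" reduction factors quoted in the article: (low end, high end)
factors = {
    "model":         (3, 10),
    "machine":       (2, 5),
    "mechanization": (1.4, 2),
    "mapping":       (5, 10),
}

low = high = 1.0
for lo, hi in factors.values():
    low *= lo    # compound the most conservative estimates
    high *= hi   # compound the most optimistic estimates

print(f"combined reduction: {low:.0f}x to {high:.0f}x")
# combined reduction: 42x to 1000x
```

Note that the conservative ends compound to only about 42x, so the 100x floor of the claim presumably assumes better-than-worst-case gains on at least some of the four fronts.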

David Patterson, a Google Research scientist and lead author of the study, says that if these four practices are followed, the carbon footprint of AI training is more likely to shrink than to grow.


M for model. At the architectural level, new AI models increasingly incorporate efficiency-oriented advances. Google, Microsoft, OpenAI, and Meta use a technique called “knowledge distillation” to train smaller “student” models that mimic a larger “teacher” model while requiring far less energy.
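The article does not detail how distillation works, but the core idea can be sketched as follows (hypothetical logits and a plain-NumPy toy, not Google's actual training code): the student is trained against the teacher's temperature-softened output distribution rather than against hard labels.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy of the student against the teacher's softened
    outputs -- the core term of Hinton-style knowledge distillation."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    return -(p_teacher * log_p_student).sum()

# A student whose logits track the teacher's incurs a lower loss:
teacher      = [4.0, 1.0, 0.5]
good_student = [3.5, 1.2, 0.4]
bad_student  = [0.5, 1.0, 4.0]
assert distillation_loss(good_student, teacher) < distillation_loss(bad_student, teacher)
```

In practice this term is combined with an ordinary hard-label loss, and minimizing it lets a much smaller network absorb most of the large model's behavior, which is where the energy savings come from.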

These companies continue to train ever-larger models, many of which never reach users. In Google’s case, training these models accounts for 40% of energy consumption, while “inference” with the models available to users (processing their queries) accounts for 60%.

While it may seem counterintuitive, the latest multimodal models released to the public, like Gemini 1.5 Pro and GPT-4o, are also more efficient than their predecessors thanks to their ability to handle different input modalities, such as images and code: they learn from less data and fewer examples than text-only models.


M for machine. The vast majority of AI companies buy their hardware from Nvidia, which makes specialized chips. But more and more companies are following the “Google model” and developing their own hardware, including Microsoft, OpenAI, and, in China, Huawei.

Google has been using its own “TPUs” (tensor processing units, chips specialized for AI) for years. The latest generation, called Trillium, was announced in May and is 67% more energy efficient than the previous one, meaning it can do more calculations with less power, both in training and fine-tuning and in AI inference in Google’s data centers.

M for mechanization. Another counterintuitive idea: cloud computing uses less energy than computing in an on-premises data center. Cloud data centers, especially those designed for artificial intelligence, house tens of thousands more servers than a typical organization’s data center and are equipped with better power-distribution and cooling systems, because their operators can afford them at that scale.

Aside from the disadvantage of handing over data to large cloud companies like Amazon, Microsoft, or Google, cloud data centers have another distinct advantage: they are more modern, meaning they have machines more specialized in AI training and inference.

M for mapping. Another reason Google advocates more cloud computing and less on-premises computing is these companies’ commitment to renewable energy. Some of these large data centers already run on 90% carbon-free energy.

Big tech companies are siting their new data centers where renewable resources, including the water used to cool servers, are abundant. This has led companies like Google, Microsoft and Apple to match 100% of the electricity used in their operations with renewable sources and to aim for net-zero emissions by the end of this decade.

On the other hand, companies like Microsoft and OpenAI are unsure whether renewable energy supplies will be able to meet growing energy demand and are already looking to expand nuclear capacity either through small modular reactors or by investing in fusion research.

Image | Google Cloud

In Xataka | The Electric Grid Is Suffering, Not Because of Electric Cars, But Because of the Massive Demand for Artificial Intelligence
