
These simple changes can make AI research much more energy efficient

Since the first paper examining the technology's environmental impact was published three years ago, researchers have begun to self-report the energy consumption and emissions of their work. Having accurate numbers is an important step toward making a difference, but actually collecting those numbers can be a challenge.

“You can’t improve what you can’t measure,” said Jesse Dodge, a research scientist at the Allen Institute for Artificial Intelligence in Seattle. “For us, if we want to make progress in reducing emissions, the first step is that we have to measure well.”

To that end, the Allen Institute recently partnered with Microsoft, the AI company Hugging Face, and three universities to create a tool that measures the electricity usage of any machine learning program running on Azure, Microsoft's cloud service. With it, Azure users building new models can see the total amount of power consumed by graphics processing units (GPUs), the computer chips that run computations in parallel, at each stage of a project, from selecting a model to training it and putting it to use. Microsoft is the first major cloud provider to give users access to information about the energy impact of their machine learning programs.
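Microsoft has not published the tool's internals, but the basic idea behind per-stage GPU energy accounting can be sketched in a few lines: sample each GPU's instantaneous power draw and integrate over time. The snippet below uses NVIDIA's NVML bindings (pynvml); the one-second sampling interval and the measure_energy_joules helper are illustrative choices, not Azure's actual API.

```python
import threading
import time

import pynvml  # NVIDIA Management Library bindings


def measure_energy_joules(workload, interval_s=1.0):
    """Run `workload` while sampling GPU 0's power draw; return joules."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples = []
    stop = threading.Event()

    def sampler():
        while not stop.is_set():
            # nvmlDeviceGetPowerUsage reports milliwatts; convert to watts.
            samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler, daemon=True)
    thread.start()
    workload()  # e.g. one stage of the project, such as a training epoch
    stop.set()
    thread.join()
    pynvml.nvmlShutdown()

    # Energy is the sum of the power samples times the sampling interval.
    return sum(samples) * interval_s
```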

While tools already exist to measure the energy use and emissions of machine learning algorithms running on local servers, those tools don't work when researchers use cloud services offered by companies like Microsoft, Amazon, and Google. These services do not give users direct insight into the GPU, CPU, and memory resources their activities consume, and existing tools such as Carbontracker, Experiment Impact Tracker, EnergyVis, and CodeCarbon require those values to produce accurate estimates.
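As a point of contrast, here is roughly how one of those local tools, CodeCarbon, is used on a machine where it can read the hardware's power figures directly; train_model here is a stand-in for the user's own code:

```python
from codecarbon import EmissionsTracker


def train_model():
    pass  # stand-in for the user's own training code


tracker = EmissionsTracker()  # reads local CPU/GPU/RAM power on the host
tracker.start()
train_model()
emissions_kg = tracker.stop()  # returns estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```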

The new Azure tool, which debuted in October, reports energy usage but not emissions. So Dodge and other researchers figured out how to map energy use to emissions, and in late June they presented a companion paper on that work at FAccT, a major computer science conference. The researchers used a service called WattTime to estimate emissions based on the zip codes of the cloud servers running 11 machine learning models.
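The accounting step itself is straightforward: multiply the energy a job used by the carbon intensity of the grid serving the server's location at that time. The intensity figures below are made-up illustrations, not WattTime data:

```python
def emissions_kg_co2eq(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Emissions = energy used (kWh) x grid carbon intensity (gCO2eq/kWh)."""
    return energy_kwh * intensity_g_per_kwh / 1000.0


# The same 100 kWh training job emits very different amounts by region.
for region, intensity in [("hydro-heavy grid", 30.0), ("coal-heavy grid", 800.0)]:
    print(f"{region}: {emissions_kg_co2eq(100.0, intensity):.1f} kg CO2eq")
```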

They found that emissions could be significantly reduced if researchers used servers in specific geographic locations and at specific times of day. Emissions from training small machine learning models can be cut by up to 80% if training begins when more renewable electricity is available on the grid, while emissions from large models can be cut by more than 20% if training is paused when renewable power is scarce and resumed when it is more abundant.
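A minimal sketch of that pause-and-resume strategy, assuming access to a live carbon-intensity signal: the intensity_fn callable, the threshold, and the polling interval are all hypothetical choices, not the paper's implementation.

```python
import time


def train_with_pauses(num_steps, run_one_step, intensity_fn,
                      threshold_g_per_kwh=300.0, poll_s=600):
    """Run training steps only while grid carbon intensity is low.

    Assumes the job checkpoints its state, so pausing between
    steps is safe and training can resume where it left off.
    """
    step = 0
    while step < num_steps:
        if intensity_fn() <= threshold_g_per_kwh:
            run_one_step(step)
            step += 1
        else:
            time.sleep(poll_s)  # suspend until the grid is cleaner
```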


