At this point, we’re all familiar with artificial intelligence and the potential issues with overreach, privacy, plagiarism, misinformation, and potential loss of work for actual humans. Not to mention just the general ick factor of it all.
But you may not be aware that AI could cause energy consumption to skyrocket so much that existing power grids can’t keep up. For example, a single training run for an AI engine like Bard or ChatGPT consumes as much power as 120 average households use in an entire year. One of these AI companies can require more power than an entire city like San Francisco just to train its engines. Current GPUs and CPUs were designed for gaming, not AI. AI training requires hundreds of servers running in parallel, which is a big challenge.
New architecture is being developed, but the current infrastructure is struggling to keep up with demand.
Is AI stretching data centers to their limits?
I recently spoke with Bill Haskell, CEO of Innventure, a platform that invents and builds companies. Recently, Innventure has been working with a company in Austin, Texas, that provides cooling for data centers. He shared the following with me:
- Data centers consume ~3% of the global power grid.
- Cooling represents 40% of that total power requirement, which is ~1.2% of the global power grid.
- A single training run from an AI engine consumes power equivalent to that utilized by 120 average households for a year.
- Processor performance has historically grown at a 6-7% CAGR; some forecast growth to a 15% CAGR due to AI utilization.
- Processing power is not the only bottleneck. Network bandwidth required to transfer data from one processor to another is an additional constraint.
- Current CPU/GPU architecture is not optimized for AI algorithms. More parallel computing is required and may include up to 100 processors working together.
- AI computing demand is doubling every 3.4 months, outstripping Moore’s Law.
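The figures above can be checked with some quick back-of-the-envelope arithmetic. This is a rough sketch using the article's estimates (3% grid share, 40% cooling fraction, demand doubling every 3.4 months, Moore's Law doubling roughly every 24 months), not measured data:

```python
# Back-of-the-envelope check of the data-center figures above.
# All inputs are the article's estimates, not measured values.

datacenter_share = 0.03   # data centers: ~3% of the global power grid
cooling_fraction = 0.40   # cooling: ~40% of data-center power

# Cooling's share of the whole grid: 40% of 3%.
cooling_share = datacenter_share * cooling_fraction
print(f"Cooling share of global grid: {cooling_share:.1%}")  # → 1.2%

# AI compute demand doubling every 3.4 months vs. Moore's Law
# (transistor counts doubling roughly every 24 months):
# compare the implied growth factor over one year.
ai_growth = 2 ** (12 / 3.4)
moore_growth = 2 ** (12 / 24)
print(f"AI demand growth per year: ~{ai_growth:.0f}x")       # → ~12x
print(f"Moore's Law growth per year: ~{moore_growth:.1f}x")  # → ~1.4x
```

In other words, a demand curve doubling every 3.4 months grows roughly an order of magnitude faster per year than the hardware curve it runs on, which is why the infrastructure is struggling to keep up.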
The reason AI engines require so much training (and therefore power) is that they don’t have the contextual abilities that humans do. The example Bill Haskell shared with me: if you see one side of a cat, you know that the other side of the cat will look pretty similar. But an algorithm lacks this ability and needs to see thousands of cat pictures before it can infer what the other side should look like.
AI is getting better and better at this, and will someday gain that contextual element. But right now, training AI is an extremely power-intensive process. Manufacturers are scrambling to produce faster and faster chips. The faster the chips, the hotter they run, and the more cooling is required. Cooling accounts for 40% of a data center’s entire energy expenditure. According to Haskell, we are reaching the thermal wall, the limit beyond which air conditioning can cool the chips. The industry has moved to liquid cooling, which brings its own issues, as it requires a lot of water.
Is there a better way to manage or offset AI power consumption?
I also touched base with Thomas G. Dietterich, Distinguished Professor, School of Electrical Engineering and Computer Science at Oregon State University, and he was a bit more optimistic about AI technology’s impact on the future of energy consumption.
“There has been a steady flow of new developments in low-precision computation for deep learning, improved data selection, efficient fine tuning algorithms, and so on,” he explains.
“The power efficiency of specialized neural computation chips is also rapidly improving. Finally, moving AI processing into data centers is helping reduce the carbon footprint of AI because the data centers are operated extremely efficiently and many of them use green energy sources. The big data center operators are locating new data centers in areas with large green power resources.
“I’m optimistic that we will find ways to gain multiple orders of magnitude in reduced power consumption for current loads, and it is within our reach to achieve zero carbon data centers. I also want to raise the issue of whether we should continue to have a ‘shortage mindset’. Advances in green power technologies may give us an economy in which power is much cheaper and more plentiful than it is today. We should work for a world of energy abundance.”
He goes on to suggest that perhaps technology companies could raise people’s awareness by including a “personal carbon footprint” (PCF) display when people use these tools. Professor Dietterich asserts, “A key bottleneck in making the transition to green power is the lack of long-distance transmission lines. Building these and expanding green power infrastructure is a much more important factor than AI power consumption in managing future climate.”
“I do think that now is the time to start raising awareness and being conscious of how our increased use of AI is impacting the environment. While it may be possible to offset this massive jump in power needed to fuel AI engines, we need to start working on greener solutions sooner rather than later.”
How will Apple respond to the increased power demand?
Apple is known for greener solutions and has formally committed to making its supply chain and products 100% carbon neutral by 2030. I expect that Apple will incorporate more and more AI into its software in the years to come, so it will need to account for that increased energy demand when fulfilling this promise.
Whether Apple keeps this promise, and whether other tech giants get on board, remains to be seen. But given Apple’s history, I’m hopeful that it will rise to the challenge and set a positive example for other technology companies to follow.