Artificial intelligence uses far more energy than other forms of computing. Training a single large model can consume more electricity than about 130 U.S. homes use in an entire year.
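The household comparison is simple arithmetic. A rough check, using widely cited public estimates (roughly 1,300 megawatt-hours to train a GPT-3-scale model, and roughly 10,000 kilowatt-hours of annual use for an average U.S. home; both figures are assumptions, not measurements):

```python
# Back-of-envelope check of the "130 homes" comparison.
# Both inputs are public estimates, not measured values.
training_energy_mwh = 1_300   # est. electricity to train a GPT-3-scale model
home_annual_kwh = 10_000      # approx. annual use of an average U.S. home

homes_per_year = (training_energy_mwh * 1_000) / home_annual_kwh
print(f"Equivalent to the annual use of about {homes_per_year:.0f} U.S. homes")
```

The ratio lands at about 130, consistent with the figure above.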
"These large language models are so power hungry, and there's just not enough energy right now," Amazon CEO Andy Jassy said at the World Economic Forum in Davos, Switzerland, in January. "So we're going to have the dual challenge as a group to find a lot more energy to satisfy what people want to do and what we can get done for society with generative AI. But we've got to do it in a renewable way, in a carbon-neutral or zero way. It can't be going back to coal."
If this worries bankers, they're not talking about it, though many use traditional AI and some are experimenting with large language models. All the largest U.S. banks declined to comment on this topic for this story. Some smaller and regional banks also declined or didn't respond to requests.
Ben Wallace, a partner at Summit Technology Consulting Group, hasn't heard any bank executives agonize over AI's energy consumption because the regional community banks his firm works with are in the very early stages of using the cutting-edge technology, he said. These banks also view energy use not as their problem but as the responsibility of the cloud providers that host their enterprise AI models, such as Microsoft, Google and Amazon.
"These hyperscalers are massive consumers of energy anyway," Wallace said in an interview. "So their footprint goes from X to Y, but X was really big before. I haven't seen a lot of concern, at least from the banking world that we serve, about their carbon footprint because everyone's still evaluating how to make it practical." These bankers are far more concerned about model governance and feeding models accurate data than they are worried about AI's energy consumption, he said.
But some say the financial industry should be worried about the broader impact of its AI adoption.
"My view on this is that we should be concerned now, and this issue will only get worse," said Ken LaRoe, founder and CEO of Climate First Bank in Eustis, Florida. "In a few years all businesses will adopt some type of AI, resulting in an exponential increase in power consumption worldwide."
Arguably, energy consumption will become a factor in AI technology decisions in the future. Not only could increased energy use conflict with a company's sustainability goals, but cloud and AI technology providers are likely to pass along the costs of running energy-hogging models. Such price increases could cut against the very reason many banks are pursuing advanced AI in the first place: to cut costs and do more with less.
"The commercial realities will probably drive banks to evaluate how they leverage this technology," Wallace said.
Scope of the problem
It is hard to measure precisely how much energy AI consumes because the companies that know those numbers keep them close, noted Steve Rubinow, a professor in the Department of Information Technology and Management at the Illinois Institute of Technology's College of Computing. The problem is not just carbon emissions but also the vast quantities of water used to cool the servers that run advanced AI models, observed Rubinow, formerly chief information officer of the New York Stock Exchange.
The complexity of the model and the amount of data it needs to crunch are also factors.
"Banks, financial services institutions, everyone who's using AI is trying to figure out the cost benefit for their particular situation and their opportunity," Rubinow said in an interview. "And in those costs are energy costs, whether they're direct or indirect, because even if they don't get an energy bill, they are paying for the energy by virtue of the supplier of the model."
Large language models are often run on high-performance Nvidia graphics processing units, or GPUs.
"It takes lots of chips and lots of time and lots of energy," Rubinow said. "What's the total cost of that and what is the offsetting benefit? I think people are still trying to figure that out."
Microsoft and Google didn't respond to requests for interviews. A representative for Amazon Web Services said studies by 451 Research, part of S&P Global, show that AWS's infrastructure is more efficient than performing those same workloads in on-premises data centers.
While training large language models like OpenAI's GPT-4 takes energy and can be very expensive, once the training is done, the models don't require as much horsepower, according to Wallace.
These models will become commoditized over time, Wallace said. When companies like banks want to use trained models, they will be able to use less expensive, faster, optimized models like xAI's Grok chatbot, which could cost about one-tenth as much as a large language model from a major provider like OpenAI or Meta, according to Wallace.
At Apple, which is on a mission to become carbon neutral, researchers say they can deploy large language models on iPhones and other Apple devices with limited memory by drawing on the devices' flash storage.
"I think their play is going to be, we'll push the LLM down to your phone, which you're already operating," Wallace said. "So the carbon footprint is arguably neutral."
What banks can do
"Banks are hogs not just in the power they use for their own operation but in their lending and deposit portfolios," said LaRoe. "If the big banks would put a tiny amount of their budget behind it, they could solve the problem for all banks. This should absolutely be something that is open sourced with all financial institutions."
LaRoe would also like to see companies be more transparent about, and accountable for, their carbon emissions. The $540 million-asset Climate First Bank is currently deeply immersed in figuring out its total Scope 1, 2 and 3 emissions as defined by the Greenhouse Gas Protocol.
"It's not easy," he said.
LaRoe hopes increasing adoption of AI will force organizations to invest more in alternative green energy innovation to not only enable AI deployment at scale, but also make renewable energy affordable and easily accessible.
UBS has a team working to finance renewable energy storage facilities that make solar- and wind-energy generation more feasible. The bank sees AI and overall increasing data usage driving up demand for more energy, including renewable energy.
"Because the wind doesn't blow a hundred percent of the time and the sun shines eight to 16 hours a day depending on where you are, there's a big role for energy storage to help in those renewable goals of the data centers and to provide energy that's renewable in the times that other renewables are not available," Ken-Ichi Hino, portfolio manager for energy storage at UBS, said in an interview. "We want to be at the front end of deploying energy storage economically, understanding that transition and capturing some of the early opportunities associated with that."
UBS is working with insurer USQRisk and Ascend Analytics, which jointly launched a product that provides revenue protections to institutions looking to invest in utility-scale energy storage projects.
Companies may need to rethink the size of the AI models they use, Rubinow said.
"Why do I need to have the works of Shakespeare and Chaucer in my generative AI model to write an equity analyst report?" he said. "I probably don't." On the other hand, some models trained on only proprietary data, which are less expensive to develop and run, haven't performed as well as models like OpenAI's GPT-4 that have been trained on most of the information on the internet.
"So maybe you do need a lot of Shakespeare in your data stores, I don't know," Rubinow said. "All that information, all that content that it's absorbed has certainly given it something that ingesting some collection of equity research reports alone is not going to."