Trump ditched Biden's executive order on AI. What changes?

On Monday, when President Donald Trump revoked the executive order on artificial intelligence that former President Joe Biden established in October 2023, the order was truly gone. Clicking a link to it produced a huge "404 – PAGE NOT FOUND" error message that filled the entire page.

That, alongside Trump's announcement on Tuesday of Stargate, a new joint venture between OpenAI, SoftBank and Oracle that will build at least $500 billion worth of computing infrastructure to power AI, sent a signal: The Biden administration's proceed-with-caution, put-risk-frameworks-in-place-first, do-no-harm approach was being tossed aside for the new administration's full-speed-ahead approach. (Stargate is not really a government initiative. The companies involved are putting their own money into it and inviting outside investors to contribute to a fund that will be used to build data centers, as these companies were already doing. Oracle CEO Larry Ellison said his company has 10 data centers under construction in Texas.)

"Biden's executive order on AI risk and safety was an important document that was trying very, very hard to think about all of the human and organizational factors that might come up, especially with the introduction of generative AI," said Tamara Kneese, director of Data & Society's Climate, Technology, and Justice program. "When you apply generative AI to a medical context or a financial context, there is a great potential for risk. And now we have much less oversight from the federal level, and that is not good."

To some observers, erasing the AI order will give the financial industry free rein to adopt AI and use it without having to worry much about oversight. Others say, not so fast.

"Trump's rescinding Biden's executive order on AI will have little impact on banks' AI initiatives," said Alenka Grealish, principal analyst at Celent. "Banks remain heavily regulated and thus will continue to maintain robust AI and data governance structures. In addition to regulatory oversight, they care about the integrity of and trust in their brand and hence will continue to vet and scale AI-related projects with strict guardrails."

Still, there will be an impact on AI projects. Kate O'Neill, creator and host of the Tech Humanist show and author of the forthcoming book "What Happens Next," says it's happening already.

"I've been noticing, as I look at the safety policies companies announced last year in relation to the AI executive order, those are gone," she said, citing Bank of America and JPMorgan Chase as two examples. "In many cases, there's a 404 where those policies used to be." This may merely mean companies are taking them down to update them, she noted.

What was in the Biden order

Biden's executive order on AI instructed government agencies to be mindful of the risks of AI. It required companies that provide widely used models to conduct safety tests and report the results of those tests. Companies that use or build their own AI models would have to provide protections against the risks of AI, including intellectual property theft, cybersecurity threats, data privacy infringement and bias and discrimination. The order aligned with the White House's blueprint for an AI bill of rights, which gave consumers five rights vis-a-vis AI: protection from unsafe or ineffective systems; no discrimination by algorithms; data privacy; notification when algorithmic systems are being used; and the ability to opt out and access customer service provided by human beings.

Trump's revoking of the AI order was anticipated, according to Jennifer Everett, partner at Alston & Bird.

"It's a reflection on the change of perspective by which the administration is addressing AI," she said. "Whereas before, the focus was not to stifle innovation, but to do so in a meaningful way that takes into account issues like consumer harm, ethics, safety and the like. With the rescission of that executive order, now the theme is moving towards deregulation, at least at the federal level, of the movement of AI innovation, and taking the brakes off of the federal government."


This will put a pause on new AI guidance from bank regulators, Everett said. 

"This doesn't mean that the wheels are coming off and that there's no regulation at all," she said. "Highly regulated organizations, like in the financial sector, are still going to have industry standards with which they're going to have to comply."

States will likely fill some of this void, too, Everett said, citing comprehensive AI bills in Colorado and Utah. She also doesn't see banks tossing the AI risk frameworks they have set up. 

"AI is new," Everett said. "It offers new innovations, just as any new technology does. But that doesn't mean that it is such a unique technology that the deployment of a product or service goes without any considerations for compliance with the law. There are still going to be laws that apply." Existing rules on data privacy, cybersecurity, antitrust and consumer protection are still going to apply, she added. 

"The rescission of the executive order is a sign that this administration may be taking a different approach," Everett said. "That doesn't mean that we're left with a Wild West."

Business leaders, the Trump administration and Republicans are likely to position the deletion of the AI order as being good for innovation, O'Neill said. 

"I think that's a false dichotomy," she said. "I don't think that we necessarily need to see deregulation to see good innovation flourish."

Companies thrive when they are not playing simply to regulation, but are focused on what is going to make the most sustainable impact on their community and on their market, she said. 

Deregulation isn't necessarily a boon for innovation, O'Neill said. "It's not bad for innovation, but it isn't necessarily good for people. And what isn't good for people isn't good for innovation."

What banks should do now

Everett recommends that banks stay the course on their current efforts to set up AI governance committees and build on their understanding of newer AI technologies. 

"That's just good governance overall," she said. "And you want to have an understanding of what it's doing with your data, just as you would do vendor due diligence for any new product that you are deploying." 

Even with less regulation, there's still the potential for private litigation when an AI engine generates an error or a biased decision, as well as legal actions from state attorneys general.

The lack of AI guidance from the White House puts the burden on business leaders to do their own governance, O'Neill agreed.

"And they should make sure that they're making their own decisions," she said. "This just puts the onus squarely on business leaders and executives' shoulders. It also means that strong risk management practices are going to have to scale with these new computational capabilities."
