AI Executive Order: Insights and Takeaways for Financial Services

Thinking specifically about the impact on the financial sector, we have noticed a few key takeaways that leaders at financial institutions should pay close attention to.

Matt McEachern

On October 30th, the Biden-Harris administration released a comprehensive 63-page executive order on the “safe, secure, and trustworthy development and use of artificial intelligence.” Since releasing the Blueprint for an AI Bill of Rights last October, the White House and federal agencies in the United States have been moving toward AI guidelines and regulations. The National Institute of Standards and Technology (NIST) released its Artificial Intelligence Risk Management Framework (AI RMF) in January, and major tech companies, including Amazon, Google, and Microsoft, committed to AI safeguards this summer.

With the release of this executive order, the White House is outlining specific AI risks and prescribing actions that agencies must take within the next 90 to 365 days. Thinking specifically about the impact on the financial sector, I have noticed a few key takeaways that leaders at financial institutions should pay attention to.

1. Financial services was explicitly mentioned several times

The executive order called upon several agencies to put forth guidelines and best practices for financial services. By March 28, 2024, the Secretary of the Treasury must release a public report on best practices for financial institutions to manage AI-specific security risks. AI introduces a whole new realm of cybersecurity risks: hackers and bad actors can use AI as a weapon to infiltrate systems, for example by using generative AI in social engineering attacks to emulate someone’s voice or create highly personalized phishing campaigns. Financial institutions will need to consider different strategies, such as updating cybersecurity practices or enhancing employee training, to protect against these types of risks.

By April 27, 2024, the Secretary of Homeland Security must incorporate NIST’s AI RMF and other security guidance into relevant safety and security guidelines for critical infrastructure owners and operators. Critical infrastructure, as defined in the Patriot Act, includes financial services. After these guidelines are completed, relevant agencies have 240 days to mandate and enforce them through regulatory or other appropriate action. Finally, independent regulatory agencies are encouraged to consider mandating the guidance themselves.

2. NIST will develop a companion resource for generative AI

NIST has long been at the forefront of evolving standards and guidance. As part of the executive order, NIST has been called upon to complete several specific activities. By July 26, 2024, NIST must:

  • Generate a companion resource to the AI RMF specific to generative AI
  • Publish guidelines and benchmarks to audit AI capabilities
  • Put together resources for the secure development of generative AI
  • Deliver guidelines for AI red-teaming tests. Red-teaming refers to creating a controlled environment in which developers can safely attack systems to expose vulnerabilities.

3. Developers of foundation models are receiving special scrutiny 

Foundation model providers, such as OpenAI, are receiving special scrutiny. The government recognizes that many companies leverage these foundation models to create specialized AI applications. By applying guidelines and regulations at the level of the foundation model, the government is effectively working toward protections that extend across all of the specialized AI applications built on top of it.

4. Agencies are discouraged from implementing generative AI bans or blocks 

Finally, one of the most interesting quotes in the entire document that could directly affect banks is: “Agencies are discouraged from imposing broad general bans or blocks on agency use of generative AI… With appropriate safeguards in place, [agencies should] provide their personnel and programs with access to secure and reliable generative AI capabilities.” The Federal Government is implying that agencies that are avoiding AI are making the wrong choice and should instead look for ways to experiment with or implement AI for “routine tasks that carry a low risk of impacting Americans’ rights.” I am very encouraged by this. Not only is the Federal Government leaning in to protect citizens’ rights, but it is also recognizing that these systems enable efficiencies and provide real value. They aren’t simply toys or a new “fad.”

What FIs should do now in response

While the executive order itself serves more as a plan or roadmap for a variety of agencies, regulation will be fast approaching in the spring and summer of 2024. Although agencies and creators of AI models will be directly impacted by these guidelines and regulations, financial institutions themselves also need to prepare for upcoming regulation. To avoid getting caught off guard, financial institutions should take two key actions at the close of 2023 and into the start of 2024:

  1. Familiarize yourself with NIST’s guidelines. If you haven’t taken a look at the current AI RMF (NIST AI 100-1), you should do so by the end of this year so you’re prepared for the companion piece in the summer. Even though these frameworks are voluntary, it’s important to consider best practices and guidance from the premier thought leaders in the industry. It’s also important to start getting comfortable using AI. It won’t be long before AI applications are table stakes for any company, including banks.
  2. Connect with vendor partners who will respond to regulatory changes. Even if you aren’t responsible for following upcoming regulation, it is your responsibility to ensure that the decisions you make about AI can flex with the future regulatory environment. The right vendor partners will keep up with regulations and navigate the future with you. If you partner with AI providers, start having conversations with them now about how they’re responding to the executive order and their plans for following the best practices and guidelines released by NIST and other agencies.

Posh will continue to watch for releases on best practices, guidelines, and regulations from federal agencies and provide insights and takeaways for the financial services industry. If you have specific questions about the new executive order and what it means for you and your business, reach out to Posh.
