Joe Biden’s AI Executive Order
U.S. President Joe Biden has issued a lengthy executive order setting out six new standards for AI safety and security, aligned with the principles of safety, security, trust, and openness.
My Executive Order on AI is a testament to what we stand for:
Safety, security, and trust. pic.twitter.com/rmBUQoheKp
— President Biden (@POTUS) October 31, 2023
The order mandates sharing the results of safety tests and accelerating the development of privacy-preserving techniques. However, the lack of detail has left many in the industry wondering whether it could stifle companies from developing top-tier models.
“This is certainly challenging for companies and developers, particularly in the open-source community, where the executive order was less directive.”
The administration intends to manage the guidelines through chiefs of AI and AI governance boards within specific regulatory agencies, which means companies building models in the sectors those agencies oversee should have a “tight understanding of regulatory frameworks” from that agency.
The government has released more than 700 use cases describing how it uses AI internally via its ‘ai.gov’ website. Martin Casado posted on X that he had sent a letter to the Biden administration about the executive order’s potential to restrict open source AI.
1/ We’ve submitted a letter to President Biden regarding the AI Executive Order and its potential for restricting open source AI. We believe strongly that open source is the only way to keep software safe and free from monopoly. Please help amplify. pic.twitter.com/Mbhu35lWvt
— martin_casado (@martin_casado) November 3, 2023
Industry Concerns
The letter called the executive order “overly broad” in its definition of certain AI model types and expressed fears that smaller companies could get tangled up in requirements intended for larger ones.
Jeff Amico, head of operations at Gensyn AI, echoed the sentiment, calling the order “terrible” for innovation in the U.S.
Biden’s AI Executive Order is out and it’s terrible for US innovation.
Here are some of the new obligations, which only large incumbents will be able to comply with pic.twitter.com/R3Mum6NCq5
— Jeff Amico (@_jamico) October 31, 2023
Regulatory Clarity Concerns
Regulatory clarity can be “helpful for companies that are building AI-first products,” but it is important to ensure that the regulatory guidelines do not skew in favor of only the world’s largest companies.
“How these regulatory frameworks are implemented now depends on regulators’ interpretations and actions,” he said.
Potential Impact of AI Regulations
Fears about AI’s “apocalyptic” potential are “overblown relative to its prospects for near-term positive impact.” New autonomous process controls driven by AI are significantly improving yields and reducing waste and emissions across industries like advanced manufacturing, biotech, and energy.
“Simply put, AI is far more likely to benefit us than destroy us.”
Hot Take: The Future of AI Innovation Under Biden’s Executive Order
While there are concerns that Biden’s executive order could stifle innovation and restrict open source AI development, there are also hopes for near-term positive impact across industries, with AI-driven autonomous process controls improving yields and reducing waste.
The focus should be on ensuring that regulatory guidelines don’t favor only the largest companies, while leaving room for earlier-stage companies’ interests in conversations between the government and the private sector.