Eye on AI

The atom bomb or the airliner? The battle over A.I. analogies will determine the fate of A.I. regulation

By Jeremy Kahn, Editor, AI
June 30, 2023, 11:54 AM ET
Some of those calling for A.I. regulation liken the technology to nuclear weapons. But Microsoft President Brad Smith says he thinks commercial aviation may be a better analogy. (Getty Images)

Hello and welcome to June’s special edition of Fortune’s Eye on A.I.

Political fights are won and lost over narratives. Tell a better story than your opponents and you’re likely to win. Framing is everything—which means picking the right metaphors or analogies can be crucial. So it was fascinating to watch Microsoft policy chief Brad Smith trial various ways of framing the A.I. regulation debate while speaking to reporters in Europe earlier this week.

Smith was in Europe to give an address at a conference in Brussels, where the European Union is currently in the final stages of negotiating its landmark A.I. Act. His speech covered how the policy positions on A.I. regulation that Microsoft recently announced might align with the A.I. Act, as well as what the company would like to see in the law’s final version.

As I’ve written previously for Fortune, Microsoft is pushing a package of policies. The company’s A.I. regulation blueprint includes what it calls “safety brakes” on the use of A.I. in critical infrastructure, like power grids and city traffic management systems, which would allow humans to quickly take back control from an automated system if something began to go awry. Microsoft has also called for rules similar to “know your customer” requirements in financial services, under which those making A.I. software would be required to “know your cloud, know your customer, and know your content.” And it has come out in favor of licensing requirements for those creating “highly capable” foundation models and running the large data centers where such models are trained. While safety brakes, licensing, and Smith’s KYC rules are not in the EU A.I. Act, he said in his speech and an accompanying blog post that implementing such rules would “be key to meeting both the spirit and obligations of the act.”

What was new in Smith’s discussion with reporters was his choice of analogies. He said he particularly favored civil aviation as an analogy for thinking about international governance of A.I. He pointed to the International Civil Aviation Organization, which was created in 1944 and is based in Montreal, as a possible model. Almost every nation on Earth, 192 in total, is a member of the organization. “The ICAO has fostered a common approach to safety standards,” Smith said. “Both the testing of safety when the aircraft are being designed and built, and their application and an ongoing monitoring of their safety is very similar when you step back and think about it to what people are talking about for foundational models.”

There are a couple of other things about the aviation analogy that strike me as interesting. For instance, civil aviation sounds fairly innocuous. We all want airplanes that can fly seamlessly across borders—and that also don’t fall out of the sky. It sounds a lot less tricky to manage than, say, nuclear power, which is the analogy that people such as OpenAI’s Sam Altman and A.I. researcher Gary Marcus have been floating, suggesting that we may need an organization similar to the International Atomic Energy Agency to police the use of powerful A.I. systems. In Altman’s telling, A.I. is a technology that could, if not handled carefully, bring about the destruction of the world, just like nuclear power. Framing A.I. regulation by analogy to nuclear power immediately conjures up images of mushroom clouds and Armageddon. Comparing powerful A.I. to commercial aircraft, not so much.

Of course, the big criticism of what Smith and Microsoft have called for in their regulatory blueprint is that the strict KYC and licensing requirements will be extremely difficult for open-source A.I. developers to comply with. In other words, what Microsoft is proposing amounts to a form of “regulatory capture,” where the Big Tech companies set the rules in such a way that they can comply, yet the same rules act as a moat, keeping smaller startups and independent software developers from competing.

I asked Smith about this. In response, he trotted out a different analogy. This time not to the airplane—but to the automobile. We require, for public safety, that anyone who wants to drive a car gets a license first. It hasn’t stopped millions of people from taking driver’s education classes and obtaining their licenses. So just because you require those building A.I. foundation models to get a license, Smith said, it doesn’t mean that open-source developers would be shut out of the game.

“We should require everyone who participates in a manner that could have an implication for safety, to put safety first,” Smith said. “But we all had to find a way to do it that can accommodate different models and I would imagine that we can.” As for regulatory capture, he pointed out that, at the end of the day, companies don’t write the rules, governments do. “And I would hope that they will find a way to write the rules in a way that can accomplish more than one goal at a time,” he said.

Of course, there might be a good reason the airplane isn’t the metaphor Smith wanted to reach for when talking about competition: while 192 countries have agreed on aviation standards, only a handful of companies have been able to compete effectively in selling commercial airliners, and that’s, at least in part, because meeting those standards is really hard.

I asked Smith if foundation models would be able to comply with the EU’s existing data privacy law, known as GDPR, as the A.I. Act says they must. Many experts say it will be difficult for these models to meet that requirement because they are often trained on such vast amounts of data, scraped from the internet, that it is hard for the creators to guarantee that no personal information has been swept up in the data haul. Others have also questioned whether companies such as OpenAI, Microsoft, and Google have a proper legal basis for handling sensitive personal information fed into the systems by users of their A.I.-powered chatbots and search engines.

Here Smith framed A.I. regulation in terms of the automobile again but in a different way. “I think one can think of the foundational model as the engine for a car,” he said. “And then you can think of the [A.I.] application as the car itself. And so you can’t have a safe car without a safe engine. But it takes a lot more to have a safe car than to just have a safe engine. And so when it comes to a lot of practical safety requirements, you have to decide: Are we talking about the car or are we talking about the engine? And I think a lot of the GDPR issues that people are focused on today probably involve the car as much as the engine.”

If that leaves you scratching your head as to whether the foundation model itself—the engine in Smith’s analogy—will be GDPR compliant, you aren’t alone. But what was clear is that Microsoft wants you to think about A.I. in terms of familiar, mostly benign technologies of the past—cars and airlines—not more exotic and scary ones, such as nuclear energy or genetic engineering, as some others have suggested. We’ll see which narrative wins over the coming months.

With that, here’s some more A.I. news from the past week.

Jeremy Kahn
jeremy.kahn@fortune.com
@jeremyakahn

A.I. IN THE NEWS

Top European executives sign letter critical of EU A.I. Act draft. More than 150 executives from major European companies, including Renault, Heineken, Airbus, and Siemens, have criticized the European Union's recently approved Artificial Intelligence Act in an open letter, tech publication The Verge reported. The executives argue that the A.I. Act, which has been in development for two years and was approved by the European Parliament on June 14, could negatively impact Europe’s technological competitiveness and sovereignty. The signatories say the act's strict rules, particularly for generative A.I. systems, could lead to onerous compliance costs and liability risks, causing A.I. providers to withdraw from the European market. They urge the EU to focus on a risk-based approach and form a regulatory body of A.I. experts to monitor the law’s application. But Dragoș Tudorache, a member of the European Parliament who led the development of the A.I. Act, defended the legislation, stating that it provides all of the things that the executives say they want: an industry-led process for defining standards and a light regulatory regime that emphasizes transparency.

A.I. can’t win a Grammy, but it could help. The music industry’s top awards program has updated its eligibility criteria to state that only human creators can be considered for nominations, according to TechCrunch. A.I.-assisted compositions, however, can be eligible if the human contribution is “meaningful” and pertains to the category for which the song is submitted, the Grammys said. The policy puts A.I. tools in the same camp as pedals and sound filters.

A.I. startup Inflection AI raises record $1.3 billion funding round. In a sign of just how frothy the generative A.I. boom has become, startup Inflection AI has raised $1.3 billion in a new funding round from notable investors such as Microsoft, Nvidia, Eric Schmidt, and Bill Gates, bringing its total venture funding to date to $1.5 billion. The funds will support the further development of Inflection’s chatbot Pi. Right now, Pi is designed to be an empathetic listener and good conversationalist, and not much else. But the company, cofounded by Mustafa Suleyman, who also helped found A.I. research lab DeepMind, and billionaire Reid Hoffman, has said its ambition is to build a powerful A.I. assistant that could function as every person’s own “chief of staff.” This kind of digital assistant is seen by many, including Gates, as the future of human-computer interaction. To achieve that goal, Inflection says it plans to build an A.I. supercomputer in collaboration with Nvidia and CoreWeave. You can read the full story from me and my Fortune colleague Rachel Shin here.

OpenAI opens a London office and hires a European lobbyist. The San Francisco-based creator of ChatGPT has selected London as the location for its first corporate office outside the U.S. as part of its ongoing expansion efforts, Bloomberg reported. OpenAI said it will be hiring for research, engineering, and business roles in London, a city Altman has credited with “world-class talent.” Meanwhile, the company also hired a new lobbyist for Europe, Sandro Gianella, who had previously been a top lobbyist for payments company Stripe and worked on Google’s European public policy team, according to a story in Politico.

U.S. plans further restrictions on A.I. chip sales to China. The Biden Administration plans to tighten export controls on the sale of certain artificial intelligence chips to China, in response to growing concerns about selling the powerful technology to a geopolitical rival, Bloomberg reported. The new rules, expected in July, would expand on restrictions initially introduced last October. They would be designed to make it harder to sell some chips, such as Nvidia’s A800 chip, to China without a license. Nvidia’s CFO, Colette Kress, noted that China represents about 20% to 25% of Nvidia’s data center revenue, and said that while overall demand for the company’s products would mitigate any impact on earnings, a long-term export ban would represent a lost opportunity. The decision reflects the administration’s determination to contain China’s technological rise and could escalate tensions between the two countries, the news service said.

Microsoft adds A.I.-powered shopping tools to Bing. Microsoft has introduced a range of A.I.-powered shopping tools for its Bing search engine and Bing AI chatbot, such as automatically generated buying guides and review summaries for queries like “college supplies,” according to a story in TechCrunch. The buying guides are now available in the U.S., with a worldwide rollout of the feature planned for the coming months. The A.I.-generated buying guides match a feature that Google has built into its experimental Search Generative Experience. Microsoft also launched Price Match, a tool that assists users in requesting a price match from a retailer.

OpenAI adds search to ChatGPT, but it has to be with Bing. OpenAI has added a “Browse with Bing” feature to its premium ChatGPT Plus app, which allows the A.I. to use Microsoft’s search engine to peruse the internet for answers to questions, TechCrunch reports. This feature can be enabled in the settings of both the iOS and Android versions of the ChatGPT app. It’s designed to help with queries related to current events or information beyond the bot’s original training data, which does not contain anything that happened after December 2021. However, some people have criticized OpenAI’s decision to offer only Bing as the search engine because of fears about bias in Bing’s search results and the lack of alternative search engines for users.

About the Author

Jeremy Kahn is the AI editor at Fortune, spearheading the publication's coverage of artificial intelligence. He also co-authors Eye on AI, Fortune’s flagship AI newsletter.
