Tom Siebel, CEO of C3.ai, discusses failure and the future after his company’s soaring IPO


Nine out of 10 companies fail when implementing artificial intelligence software, says Tom Siebel, the billionaire founder and chief executive of C3.ai. And in that failure, Siebel has found opportunity.

C3, which sells software that enables large companies to implement and manage A.I. applications at scale, saw its shares soar more than 150% in their first day of trading on the New York Stock Exchange Wednesday. The company raised $651 million in the initial public offering, which valued the business at more than $10 billion.

“Virtually every one of our customers failed one, two, or three times” trying to build their own A.I. systems, Siebel told Fortune hours after the IPO. He says this is the same pattern he had seen earlier in his career, first selling database software in the 1980s at Oracle, and later building Siebel Systems, the sales-force automation and customer relationship software company he sold to Oracle for $5.85 billion in 2005. “History doesn’t repeat itself, but it sure rhymes,” he says. “These companies try a few times. They fire the CIO, and then they get serious and get the job done.”

C3, which Siebel founded in 2009, saw its revenues leap 71% to $157 million in the 12 months through April 2020. But the company’s expenses, particularly its research and development costs and its sales and marketing spending, are growing even faster. As a result, the company lost $70 million in the same period.

“We are building a structurally profitable, structurally cash positive business,” Siebel says. But, he notes, investing in the company’s growth means that it will continue to lose money on an operating basis for the next few years. He says that the cash flow should turn positive three to four years from now and that the business should be able to generate profit margins in excess of 20% in the long run.

The company has around 50 customers but generates a large portion of its sales, 44%, from just three of them: oil services firm Baker Hughes, French energy company Engie, and industrial equipment maker Caterpillar. Siebel says the firm is rapidly diversifying and that he knows of at least 50 additional customers the company is working to close in the next six months.

The CEO says he sees the biggest market for A.I. software in health care, where he predicts it will help usher in a revolution in personalized medicine, helping doctors to determine which patients are most likely to develop certain diseases and intervene earlier to prevent them. He says it will also help with more targeted treatments for cancer and other conditions.

Siebel, who is nothing if not outspoken, says the incoming Biden administration should set guidelines for A.I. companies, particularly around A.I. ethics. “There are many cases where A.I. is being used, particularly by social media, to, I think, enormous social detriment,” he says. He points to mental health issues in young people that are caused or exacerbated by social media, as well as political polarization and disinformation, as areas where the government should step in and regulate technology. He also says regulation is needed on how companies can use personally identifiable information in training A.I. software.

He urges the incoming Biden administration to commit more resources to using A.I. in military and intelligence applications, particularly in light of China’s multiyear, multibillion-dollar investment in these technologies. “Make no mistake, we’re at war with China in artificial intelligence,” he says. C3 has major contracts with the U.S. Air Force, for which it has built a system that predicts when aircraft parts will need to be replaced, helping the force keep more of its planes ready to fly, as well as with Raytheon, a major defense contractor. And while providing A.I. systems to the U.S. military has proved controversial for some tech companies—with Google pulling out of the Pentagon’s Project Maven in 2018 following an uproar among its employees—Siebel says C3 has no issue providing A.I. technology to the U.S. military, so long as a human remains “in the loop” in any system the company helps deploy. “We’re proud to serve democratic governments and governments that support human rights and individual liberty, and we will continue to do so,” he says.

And while some A.I. researchers have also objected to providing systems to oil companies that continue to extract hydrocarbons, Siebel tells Fortune he has no problem providing A.I. software to large energy companies to help them become more efficient. “What we’re doing for some of the largest utilities in the world, and some of the largest oil and gas companies in the world, is that we’re allowing them to reinvent themselves,” he says. “To help them convert themselves to safe energy, secure energy, lower-cost energy, and much, much higher reliability clean energy.”

In general, Siebel says, A.I. ethics is too important to delegate to an A.I. ethics officer or an A.I. ethics department. “That’s just a cop-out,” he says. “The CEO needs to own this, the whole management team needs to own this, and the board.”

This story has been updated to correct the exchange on which C3.ai’s shares are traded.

