Apple has a terrific reputation when it comes to security. That’s why it was such a shock to learn last month that hackers had found a way to break into the company’s famous iPhones, and even take over the camera and microphone without the user knowing it.

Apple released a software patch on Aug. 25 that users could download to protect their iPhones from the sinister spyware known as “Pegasus.” The patch process, however, took a full 10 days to complete after security researchers tipped the company off about the problem. Given the gravity of the situation, did Apple drag its feet?

Based on conversations with those familiar with the events, Apple did exactly what it should have done. But the Pegasus scare shows how hard it is for companies to respond when their software is compromised, and why Apple and mobile computing may never be safe again.

10 Days of Pegasus

Mike Murray leads research at Lookout, a security company in San Francisco that specializes in threats to mobile devices. He was part of the team that uncovered how a shadowy company called NSO Group had created the Pegasus hack and sold access to it to a nasty band of customers across the globe.

As Lookout and Citizen Lab, an academic research group in Toronto, reported in blog posts, the Pegasus discovery came after a human rights activist forwarded a screenshot of a suspicious link he had received via text message. In a stroke of luck, a Lookout executive promptly activated the link on an iPhone to see what it would do. As Murray explained, the decision to test it quickly proved important, since the link was built to time out after 30 minutes.

The researchers soon realized they had stumbled on a powerful weapon to invade an iPhone. They worked through a weekend to figure out just how Pegasus worked, and then it was time to tell Apple.

After the alert went out, Murray says Apple embarked on an urgent three-phase process over 10 days to defeat Pegasus.

“The first three or four days was to figure out how all the exploits worked, where the vulnerability was in the code, and preparing for the fixes that would be made,” Murray told me. “Then three days to fix it and prepare for the QA.”

The QA (quality assurance), it turns out, is the most critical part of the process in these situations. The reason is that if Apple got the fix wrong, it could open the door to a whole new wave of vulnerabilities in the wild.

The QA process is also complicated. It involves preparing versions of the software patch tailored to different phone carriers, and then working with those carriers to deliver the patches for customers to download.

“It would have taken three days. They probably worked around the clock on the QA,” said Murray.

Apple declined to comment for this story, but a person close to the company said Murray’s account of the three-phase, 10-day process is accurate.

The upshot is that, even for a company with the resources of Apple, serious security problems can take a relatively long time to repair, and there are few shortcuts. For those who might insist there is a quicker way, Murray cited the familiar adage that you can’t put nine women in a room and make a baby in a month.

Age of Mobile Attacks Is Here

Apple’s response to Pegasus provides insight into the patching process that occurs when a big company discovers its software is exposed to an attack. But the overall episode is also notable because it shows how hackers are treating our phones like the computers they are, and that security is elusive.

“This changes mobile. For the first time, iOS is vulnerable—people can no longer rely on ‘Apple will protect me,’” said Murray.

He added that Pegasus is notable because most big mobile security scares until now have been theoretical: whenever someone has discovered a major vulnerability, there has typically been little evidence the exploit was widely used for nefarious purposes.

The Pegasus exploit was different: not only did hackers find a weakness in iOS, but they also used it to create a potent cyber weapon that was sold across the globe. (They also found a similar vulnerability in OS X, the software that runs Apple computers, which has now also been fixed.)

Given that serious vulnerabilities take days or weeks to fix, and that mobile phones are an indispensable tool for nearly everyone, the importance of so-called bug bounty programs for cell phones is likely to grow.

These programs, which involve companies paying hackers to disclose software vulnerabilities, are becoming nearly universal. Even Apple, a longtime holdout, finally announced a bug bounty program of its own last month (and already a private firm has said it will pay more for the same information).

But overall, consumers may have to get used to the idea that no phone, not even one made by Apple, is fully secure, and that even when exploits are discovered, there is no quick way to fix them.