Apple is making a big change to the way it records—and improves—Siri. And it's doing it in typical Apple fashion.
In a statement on Wednesday, Apple apologized for not “fully living up to our high ideals.” The company said that it is changing its practice of having outside contractors listen to recordings made by Siri users, which was previously a step in its evaluation process for the virtual personal assistant technology.
Under Apple’s new policy, the company will instead use computer-generated transcriptions to evaluate Siri’s performance. In addition, the company will no longer keep audio recordings it has collected.
Apple is now also asking users who are willing to help improve Siri to opt in to voice recordings. The company says only Apple employees will be allowed to listen to those recordings.
Apple’s announcement follows a report from The Guardian in May that said the iPhone maker had hired contractors to listen to the commands its users had voiced to Siri. Those contractors would then grade the software’s responses to the queries. Apple officially ended the practice three months later.
While Apple had anonymized the data, there were also concerns that sensitive information could have been recorded during the process. For instance, one source told The Guardian that the recordings included people discussing drug deals and having sexual intercourse.
Apple has not been alone in recording voice assistant queries and relying on humans to review them. Investigations also found that Amazon, Google, Microsoft, and Facebook have similarly used human reviewers to improve their services. All of those companies have banned human reviews in recent weeks.
However, Apple’s new policy sets the company apart from its competition by turning off recording by default. Users who approve of having their recordings analyzed must opt in to the evaluation program. By contrast, users of other voice assistants are recorded by default and, to keep their voice commands fully private, must track down the relevant setting and deactivate recording themselves.
Moving forward, this policy change will help Apple distinguish itself from its competition. Instead of going on the defensive, Apple has made itself the protagonist in a battle to protect user privacy, a play similar to ones the company has run in the past.
For example, in 2016, Apple took up the mantle of privacy advocate by refusing to help the FBI access the data stored on the San Bernardino shooter’s iPhone. Law enforcement officials railed against Apple’s decision, but the iPhone maker argued that the security of its platform and users’ data mattered most.
Historically, the iPhone maker has also come under fire from app developers for having too much control over the programs that make their way to the company’s iOS devices. Apple, which unilaterally approves, denies, or removes apps from its App Store, has turned that criticism into an argument for higher standards ensuring a more secure product for its customers.
And so it goes again, this time with Siri. Whether the new policy will affect Siri’s performance remains to be seen. Apple said that its previous policy collected fewer than 0.2% of Siri requests, which it said is enough for the company to analyze the assistant’s performance and train the artificial intelligence to do better.