Over the years, Apple has built a reputation for protecting privacy in an industry of data-hungry, growth-seeking tech companies.
In a multi-platform ad campaign, the company told consumers “what happens on your iPhone, stays on your iPhone,” and equates its products with security through catchphrases like “Privacy. That’s iPhone.”
But experts say that while Apple sets the bar for hardware and, in some cases, software security, the company can do more to keep user data from falling into the hands of police and other authorities.
In recent years, U.S. law enforcement agencies have increasingly used data collected and stored by tech companies for investigations and prosecutions. Experts and civil liberties advocates have raised concerns about authorities’ widespread access to consumers’ digital information, warning it could violate Fourth Amendment protections against unreasonable searches. Those concerns will only intensify as protected practices like abortion are criminalized in many states.
“The more a company like Apple can do to either not accept law enforcement requests or be able to say they can’t comply with those requests by using tools like end-to-end encryption, the better it will serve the company,” Caitlin Seeley George, campaigns and managing director of the digital advocacy group Fight for the Future, said in a statement.
Apple gives data to law enforcement 90% of the time

According to its own transparency report, Apple receives thousands of law enforcement requests for user data each year, and cooperates with them overwhelmingly.
In the first half of 2021, Apple received 7,122 law enforcement requests in the United States for account data on 22,427 people. According to the company’s most recent transparency report, Apple provided some level of data in response to 90 percent of those requests. Of the 7,122 requests, the iPhone maker challenged or rejected 261.
The company’s compliance rate is roughly in line with, and sometimes slightly higher than, that of peers like Facebook and Google. However, both of those companies receive far more law enforcement requests than the iPhone maker.
In the second half of 2021, Facebook received nearly 60,000 law enforcement requests from U.S. authorities and provided data in 88 percent of cases, according to the company’s latest transparency report. During the same period, Google received 46,828 law enforcement requests, affecting more than 100,000 accounts, and handed over some level of data in response to more than 80 percent of requests, according to the search giant’s transparency report. That’s more than six times the number of law enforcement requests Apple has received in a comparable time frame.
That’s partly because the amount of user data Apple collects is dwarfed by that of other players in the space, said Jennifer Golbeck, a professor of computer science at the University of Maryland. She noted that Apple’s business model relies less on marketing and advertising, operations built on collecting user data. “They naturally don’t analyze people’s data the way Google and many other places do,” she said.
Apple has drawn up detailed guidelines outlining exactly what data authorities have access to and how — a level of detail that the company says is in line with best practice.
iCloud and other services are vulnerable despite hardware security
But privacy advocates say significant gaps remain.
While iMessages sent between Apple devices are end-to-end encrypted, so that no one but the sender and recipient can read them, messages backed up to Apple’s cloud storage service, iCloud, do not get the same level of encryption.
Apple’s law enforcement guidelines state that “iCloud content, as it exists in the customer’s account,” can be turned over to law enforcement under a search warrant. This includes everything from a detailed log of the time, date and recipients of emails sent in the past 25 days to the full contents of “stored photos, documents, contacts, calendars, bookmarks, Safari browsing history, Maps search history, messages and iOS device backups.” Backups of the device itself may include “photos and videos in the Camera Roll, device settings, app data, iMessage, Business Chat, SMS and MMS [multimedia messaging service] messages and voicemail,” according to Apple.
Golbeck is an iPhone user, but she chose not to use iCloud because she fears the system is vulnerable to hacking and law enforcement requests. “I’m one of those people who, if someone asks if they should get an Android or an iPhone, I think, well, the iPhone is going to be more protective than Android, but the bar is low,” she said.
“[Apple’s] hardware is the most secure on the market,” echoed Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, a privacy rights group. But the company’s policies around iCloud data also worried him: “I had to spend a lot of time opting out of what they were trying to automate. Pushing me to use things that were supposed to make my life better, but actually just put me at risk.”
“As long as Apple continues to limit privacy to hardware design issues, rather than focusing on the entire lifecycle of data and the full range of threats to government surveillance, Apple will fail,” he argued.
That double standard, Cahn said, was already evident in Apple’s stance during its most high-profile privacy case, which followed the 2015 mass shooting in San Bernardino, California.
At the time, Apple refused to comply with the FBI’s demand that it create a backdoor to access the shooter’s locked iPhone. The company argued that hackers and law enforcement officials could exploit such a security bypass in future cases.
But the company said in court filings that if the FBI had not changed the phone’s iCloud password, a backdoor would not have been necessary, because the data would have been backed up to iCloud and therefore available through a subpoena.
In fact, the company said, Apple had already “provided all the data it had related to the attacker’s account” before that point.

“They say they won’t break into their own iPhones, but they’re happy to effectively break into iCloud backups,” Cahn said.
In a statement, Apple said it believes privacy is a fundamental human right, arguing that users always have the right to opt out when companies collect their data.
“Our products include innovative privacy technologies and techniques designed to minimize the amount of data that we or anyone else can access,” said Trevor Kincaid, an Apple spokesman, adding that the company is proud of newer privacy features such as App Tracking Transparency and Mail Privacy Protection, which give users more control over the information shared with third parties.
“Wherever possible, data is processed on the device, and in many cases we use end-to-end encryption. Where Apple does collect personal information, we are clear and transparent about it, telling users how their data is used and how to opt out at any time.”
Kincaid added that Apple reviews every legal request and is obligated to comply with those that are valid, but he stressed that the personal data Apple collects is limited to begin with. For example, the company encrypts all health data and does not collect device location data.
People ‘have absolutely no idea what’s going on with their data’
Meanwhile, privacy advocacy groups such as the Electronic Frontier Foundation (EFF) are pressing Apple to implement end-to-end encryption for iCloud backups.
“When we say they’re better than everyone else, it’s more of an indictment of what everyone else is doing, not necessarily that Apple is particularly good,” said Erica Portnoy, a staff technologist at the EFF.
Portnoy applauds Apple’s default protection for some services like iMessage. “In some ways, some of the default settings are better [than other companies’], and that’s fine,” she said. But, she noted, messages are only secure when they’re sent between iPhones.
“We know that unless messages are encrypted end to end, many people have access to those communications,” said George, whose group, Fight for the Future, has launched a campaign pushing Apple and other companies to better protect their messaging systems.
George believes it’s a problem companies can solve, for example with a Google-backed messaging standard called Rich Communication Services (RCS). The standard itself is not end-to-end encrypted, but unlike SMS and MMS it supports encryption, and adopting it would allow Apple to protect messages sent between iPhones and Android devices, she said.
At the Code 2022 tech conference, Apple CEO Tim Cook said the company does not plan to support RCS, arguing that users have not indicated it is a priority. But users “don’t know what RCS is,” George said. “If Apple really didn’t want to use RCS because it came from Google, they could come up with other solutions that would show a sincere effort to protect people’s information.”
Kincaid said consumers are not asking for another messaging service because many encrypted options, such as Signal, already exist. He also said Apple is concerned that RCS is not a modern standard and is not encrypted by default.
Golbeck, who runs a TikTok channel about privacy, said people “have absolutely no idea what’s going on with their data” and “think they have some privacy they don’t have.”
“We really don’t want our own devices to become a surveillance tool for the state,” Golbeck said.