In 2021, Apple was embroiled in controversy over plans to scan iPhones for child sexual abuse material. Privacy experts warned that governments could abuse the system, and the backlash was so intense that Apple eventually abandoned the plan.
Two years later, Apple is facing criticism from child safety crusaders and activist investors who are calling on the company to do more to protect children from online abuse.
A child advocacy group, the Heat Initiative, has raised $2 million for a new national advertising campaign calling on Apple to detect, report and remove child sexual abuse material from iCloud, its cloud storage platform.
Next week, the group will release digital advertisements on websites popular with Washington policymakers, such as Politico. It will also put up posters across San Francisco and New York saying: “Child sexual abuse material stored on iCloud. Apple allows it.”
The criticism speaks to a predicament that has dogged Apple for years. The company made protecting privacy a central part of its iPhone pitch to consumers. But that promise of security has helped make its services and devices, two billion of which are in use, useful tools for sharing images of child sexual abuse.
The company is caught between child safety groups, who want it to do more to stop the spread of such materials, and privacy experts, who want it to keep its promise of secure devices.
A group of two dozen investors with nearly $1 trillion in assets under management has also called on Apple to publicly report the number of abusive images it catches across its devices and services.
Two of the investors, the Belgian asset manager Degroof Petercam and the Catholic investment firm Christian Brothers Investment Services, will submit a shareholder proposal this month that would require Apple to provide a detailed report on how effective its safety tools are at protecting children.
“Apple seems to be stuck between privacy and action,” said Matthew Welch, an investment specialist at Degroof Petercam. “We thought a proposal would wake up management and get them to take this more seriously.”
Apple has been quick to respond to child safety advocates. In early August, its privacy executives met with the investor group, Mr. Welch said. Then, on Thursday, the company responded to an email from the Heat Initiative with a letter defending its decision not to scan iCloud. It shared the letter with Wired, a technology publication.
In Apple’s letter, Erik Neuenschwander, its director of user privacy and child safety, said the company had determined it was “not practically possible” to scan iCloud photos without “compromising the security and privacy of our users.”
“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” Mr. Neuenschwander said.
Apple, he added, has created a new default feature for all child accounts that intervenes with a warning if a child receives or tries to send nude photos. It is designed to prevent the creation of new child sexual abuse material and to limit the risk of predators coercing and blackmailing children for money or nude photos. The company has also made those tools available to app developers.
In 2021, Apple said it would use a technology called image hashing to detect abusive material on iPhones and in iCloud.
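Image hashing does not inspect photos directly; it computes a compact fingerprint of each image and compares it against fingerprints of already-identified abusive material. The sketch below illustrates only the general idea, using the open-source imagehash library as a stand-in for Apple’s proprietary NeuralHash system; the hash value and distance threshold are placeholders, not real values.

```python
# A rough sketch of image-hash matching, the general technique Apple proposed.
# The open-source `imagehash` library stands in for Apple's proprietary
# NeuralHash; the hash value and threshold below are hypothetical.
from PIL import Image
import imagehash

# In practice, the hash list comes from a clearinghouse such as NCMEC, which
# distributes fingerprints of known abusive images, never the images themselves.
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")]  # placeholder value

MAX_DISTANCE = 5  # Hamming-distance threshold for a "match" (assumed)

def matches_known_material(path: str) -> bool:
    """Return True if an image's perceptual hash is near any known hash."""
    candidate = imagehash.average_hash(Image.open(path))
    # Perceptual hashes of visually similar images differ by only a few bits,
    # so they are compared by Hamming distance rather than exact equality.
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```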
But the company failed to communicate that plan widely with privacy experts, deepening their skepticism and fueling concern that governments could abuse the technology, said Alex Stamos, the director of the Stanford Internet Observatory at the Cyber Policy Center, who opposed the idea.
Last year, the company quietly abandoned its plan to scan iCloud, catching child safety groups off guard.
Apple has won praise from both privacy and child safety groups for its efforts to stop the creation of new nude photos in iMessage and other services. But Mr. Stamos, who applauded the company’s decision not to scan iPhones, said it could do more to stop people from sharing problematic photos in the cloud.
“You can have privacy if you keep something for yourself, but if you share something with other people, you don’t get the same privacy,” Mr. Stamos said.
Governments around the world are putting pressure on Apple to act. Last year, the eSafety Commissioner in Australia released a report criticizing Apple and Microsoft for not doing more to proactively police their services for abusive material.
In the United States, Apple made 160 reports in 2021 to the National Center for Missing and Exploited Children, a federally designated clearinghouse for abusive material. Google made 875,783 reports, while Facebook made 22 million. Those reports do not always reflect truly abusive material; some parents have had their Google accounts suspended and been reported to the police for photos of their children that were not criminal in nature.
The Heat Initiative timed its campaign ahead of Apple’s annual iPhone unveiling, scheduled for September 12. The campaign is led by Sarah Gardner, formerly vice president for external affairs at Thorn, a nonprofit founded by Ashton Kutcher and Demi Moore to fight child sexual abuse online. Ms. Gardner has raised money from several child safety supporters, including the Children’s Investment Fund Foundation and the Oak Foundation.
The group has created a website documenting law enforcement cases in which iCloud is named. The list includes child pornography charges filed against a 55-year-old man in New York who had more than 200 photos stored in iCloud.
Ms. Gardner said the Heat Initiative planned to target advertising throughout the fall in places where it would reach Apple customers and employees. “The goal is to keep running those tactics until Apple changes its policy,” Ms. Gardner said.
Kashmir Hill contributed reporting.