Almost every week, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked the same question by his 14-year-old daughter: Can I download this app?
Mr. Levine responded by scanning hundreds of customer reviews on the App Store for allegations of child molestation or sexual abuse. The manual and arbitrary process made him wonder why more resources weren’t available to help parents make quick decisions about apps.
For the past two years, Mr. Levine has sought to help parents by designing a computational model that analyzes customer reviews of social apps. Using artificial intelligence to analyze the context of reviews containing words like “child porn” or “pedo,” he and a team of researchers developed a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.
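The researchers have not published the code behind their model, but the first stage the article describes — matching review text against seed terms and examining the surrounding context — can be sketched in a few lines. The sketch below is a hypothetical illustration, not the App Danger Project’s actual pipeline: the term list, the `flag_reviews` function and the sample reviews are all assumptions, and the real system relies on a trained model to judge context rather than keyword matching alone.

```python
# Hypothetical sketch of a keyword-plus-context filter over app reviews.
# Not the App Danger Project's code; the term list and data are illustrative.
import re
from typing import Iterable

# Assumed seed terms drawn from the article's examples; a real system
# would pair matching with a trained classifier, not keywords alone.
FLAG_TERMS = ["child porn", "pedo", "predator", "grooming"]

_PATTERN = re.compile(
    "|".join(re.escape(term) for term in FLAG_TERMS), re.IGNORECASE
)


def flag_reviews(reviews: Iterable[str], window: int = 40) -> list[dict]:
    """Return reviews containing a flagged term, plus surrounding context.

    The context snippet is what a second pass (a classifier, or the
    manual read described in the article) would inspect to exclude
    reviews that don't actually raise a child-safety concern.
    """
    flagged = []
    for text in reviews:
        match = _PATTERN.search(text)
        if match:
            start = max(0, match.start() - window)
            end = min(len(text), match.end() + window)
            flagged.append(
                {
                    "term": match.group(0),
                    "context": text[start:end],
                    "review": text,
                }
            )
    return flagged


if __name__ == "__main__":
    sample = [
        "Great app, my whole family uses it.",
        "Reported an account sending predator-style messages to kids.",
    ]
    for hit in flag_reviews(sample):
        print(f"matched {hit['term']!r}: ...{hit['context']}...")
```

In the workflow the article goes on to describe, the context snippets collected this way would feed the second pass, in which the team read each flagged review and discarded those unrelated to child safety.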
The website collects user reviews that mention sexual predators and provides safety assessments of apps with negative reviews. It lists reviews alleging sexual abuse. Although the team did not follow up with reviewers to verify their claims, it read each review and excluded those that did not highlight child safety concerns.
“There are reviews out there that talk about the kind of dangerous behavior that’s going on, but those reviews are drowned out,” Mr. Levine said. “You can’t find them.”
Predators are increasingly using apps and online services to collect explicit images. Last year, law enforcement received 7,000 reports of children and teenagers who were coerced into sending nude photos and then blackmailed for photographs or money. The FBI declined to say how many of those reports were credible. The incidents, called sextortion, have more than doubled during the pandemic.
Because Apple and Google’s app stores don’t offer keyword searches, Mr. Levine said, it can be difficult for parents to find warnings of inappropriate sexual behavior. He envisions the App Danger Project, which is free, complementing other services that review the suitability of products for children, such as Common Sense Media, by identifying apps that aren’t doing enough to police their users. He has no plans to profit from the site but is encouraging donations to the University of Massachusetts to offset its costs.
Mr. Levine and a dozen computer scientists investigated the number of reviews warning of child sexual abuse on more than 550 social networking apps distributed by Apple and Google. They found that a fifth of those apps had two or more complaints of child sexual abuse material and that 81 apps across the App and Play stores had seven or more of those types of reviews.
Their investigation built on previous reports of apps with complaints of unwanted sexual interactions. In 2019, The New York Times detailed how predators treat video games and social media platforms as hunting grounds. A separate report that year by The Washington Post found thousands of complaints across six apps, leading Apple to remove the apps Monkey, ChatLive and Chat for Strangers.
Apple and Google have a financial interest in distributing apps. The tech giants, which take up to 30 percent of app store sales, helped three apps with multiple user reports of sexual abuse generate $30 million in sales last year: Hoop, MeetMe and Whisper, according to Sensor Tower, a market research firm.
In more than a dozen criminal cases, the Justice Department has described those apps as tools used to solicit sexual images or meetings with children — Hoop in Minnesota; MeetMe in California, Kentucky and Iowa; and Whisper in Illinois, Texas and Ohio.
Mr. Levine said Apple and Google should give parents more information about the dangers posed by some apps and better police those with a track record of abuse.
“We’re not saying that every app with reviews that say child predators are on it should be taken down, but if they have the technology to check this, why are some of these problematic apps still in the stores?” asked Hany Farid, a computer scientist at the University of California, Berkeley, who worked with Mr. Levine on the App Danger Project.
Apple and Google say they regularly scan user reviews of apps with their own computational models and investigate allegations of child sexual abuse. When apps violate their policies, they are removed. Apps carry age ratings to help parents and children, and software allows parents to veto downloads. The companies also offer app developers tools for policing child sexual material.
A spokesperson for Google said the company investigated the apps listed by the App Danger Project and found no evidence of child sexual abuse material.
“While user reviews play an important role as a signal to trigger further investigation, allegations from reviews are not reliable enough on their own,” he said.
Apple also investigated apps listed by the App Danger Project and removed 10 that violated its rules for distribution. It declined to provide a list of those apps or the reasons why it acted.
“Our App Review team works 24/7 to carefully review every new app and app update to ensure it meets Apple’s standards,” a spokesperson said in a statement.
The App Danger Project said it had found a significant number of reviews suggesting that Hoop, a social networking app, was not safe for children; for example, it found that 176 of 32,000 reviews since 2019 included reports of sexual abuse.
“There are a lot of sexual predators out here spamming people with links to join dating sites, as well as people named ‘Read my picture,’” says one review taken from the App Store. “It has a picture of a small child and says go to their site for child porn.”
Hoop, which is under new management, has a new content moderation system to strengthen user safety, said Liath Ariche, Hoop’s chief executive, adding that the researchers spotlighted how the original founders struggled to deal with bots and malicious users. “The situation has greatly improved,” the chief executive said.
The Meet Group, which owns MeetMe, said it did not condone the abuse or exploitation of minors and used artificial intelligence tools to spot predators and report them to law enforcement. It reports inappropriate or suspicious activity to the authorities, including a 2019 episode in which a man from Raleigh, N.C., solicited child pornography.
Whisper did not respond to requests for comment.
Sgt. Sean Pierce, who heads the San Jose Police Department’s task force on internet crimes against children, said some app developers avoid investigating sextortion complaints to minimize their legal liability. The law says they don’t have to report criminal activity unless they see it, he said.
“The apps are more to blame than the app store because the apps are doing it,” said Sergeant Pierce, who offers presentations to San Jose schools through a program called the Vigilant Parent Initiative. Part of the challenge, he said, is that many apps connect strangers for anonymous conversations, making it difficult for law enforcement to verify them.
Apple and Google make hundreds of reports annually to the U.S. clearinghouse for child sexual abuse but don’t specify whether any of those reports are related to apps.
Whisper was among the social media apps that Mr. Levine’s team found had many reviews mentioning sexual exploitation. After downloading the app, a high school student received a message in 2018 from a stranger who offered to contribute to a school robotics fund-raiser in exchange for a topless photo. After she sent the photo, the stranger threatened to send it to her family unless she provided more photos.
The teenager’s family reported the incident to local law enforcement, according to a report by the Mascoutah Police Department in Illinois, which later arrested a local man, Joshua Breckel. He was sentenced to 35 years in prison for extortion and child pornography. Although Whisper was not found responsible, it was named, along with a half-dozen other apps, as among the primary tools he used to collect images from victims ranging in age from 10 to 15.
Chris Hoell, a former federal prosecutor in the Southern District of Illinois who worked on the Breckel case, said the App Danger Project’s comprehensive analysis of reviews can help parents protect their children from issues with apps like Whisper.
“It’s like an aggressively spreading, treatment-resistant tumor,” said Mr. Hoell, who now has a private practice in St. Louis. “We need more tools.”