
Apple to scan iPhones, other devices for child sex abuse images

Published: Aug. 20, 2021 at 6:28 PM EDT

CLEVELAND, Ohio (WOIO) - It’s a new way to detect child pornography on cell phones.

Apple plans to scan iPhones and other devices for images of child sexual abuse.

Child advocates are praising the idea, but some researchers worry the system could be abused.

A new tool from Apple called neuralMatch aims to fight child predators.

It detects known images of child sexual abuse by scanning photos on iPhones, iPads and other devices before they are uploaded to iCloud.

If the program finds a match, the picture will be reviewed by a person.

If the reviewer confirms it is child pornography, the user’s account will be disabled.

Apple will then notify police and the National Center for Missing and Exploited Children.
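
In code, that escalation path amounts to a simple gate: nothing happens automatically until a human confirms the match. Below is a minimal Python sketch of the sequence the article describes; every helper function is a hypothetical stand-in, since Apple’s internal review tooling is not public.

def human_reviewer_confirms(image_id: str) -> bool:
    # Hypothetical stand-in for Apple's manual review step;
    # the real process is not public.
    return True

def disable_account(user_id: str) -> None:
    print(f"Account {user_id} disabled.")

def report_match(user_id: str) -> None:
    print(f"Report for {user_id} sent to police and NCMEC.")

def handle_flagged_image(user_id: str, image_id: str) -> None:
    # Per the article: a human reviews every match, and only a
    # confirmed match leads to disablement and a report.
    if human_reviewer_confirms(image_id):
        disable_account(user_id)
        report_match(user_id)

handle_flagged_image("user123", "img456")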

Frank Bajak covers cybersecurity for the Associated Press.

“They say, without defeating encryption – which Apple’s technology is prized for, that privacy element – they say that they can identify using a digital fingerprint, if you will, any of these known images of child sexual abuse,” Bajak said.
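
The “digital fingerprint” Bajak describes is a perceptual hash: a short value computed from the image itself, so matching can happen on the device without anyone reading the user’s other content. Here is a minimal Python sketch of the idea. Apple’s actual NeuralHash algorithm is proprietary; SHA-256 is used below only so the example runs, whereas a real perceptual hash would also tolerate resizing and re-encoding.

import hashlib

# Illustrative placeholder fingerprints; in the real system the list of
# known-image hashes comes from NCMEC, not from Apple's own scanning.
KNOWN_IMAGE_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for Apple's proprietary NeuralHash. A true perceptual hash
    # maps visually similar images to the same value; SHA-256 only matches
    # byte-identical files and is used here purely to keep the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_before_upload(image_bytes: bytes) -> bool:
    # Checked on the device before the photo goes to iCloud: only images
    # whose fingerprint is already in the known database are flagged.
    return fingerprint(image_bytes) in KNOWN_IMAGE_FINGERPRINTS

print(flag_before_upload(b"test"))       # True: matches the placeholder hash
print(flag_before_upload(b"new photo"))  # False: novel images never match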

Apple also plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.

Tech companies like Google and Facebook have been sharing “digital fingerprints” of known child sexual abuse images with law enforcement for years.

Here in Ohio, the Ohio Internet Crimes Against Children task force, or Ohio ICAC, receives about a thousand cyber tips a month.

“I think privacy goes out the window when you start sharing this material online with other users, when you start publicly advertising it out there. But at the end of the day, we have a duty to investigate these cases and these companies have a duty to make sure they’re protecting their interests as well. So I would certainly applaud Apple from the standpoint that they’re taking a proactive step to safeguard children,” said Ohio ICAC statewide commander Dave Frattare.

Apple’s neuralMatch will only flag images already in its database of known child pornography.

So if you’re a parent snapping a pic of your kid in the bathtub, you should be fine.

Some cybersecurity experts and civil liberties organizations are against the detection tool.

They worry the technology degrades privacy for Apple users.

The tech company plans to roll out the updates to its devices later this year.

You can read about Apple’s expanded protections for children here.

Copyright 2021 WOIO. All rights reserved.