At the Cost of Privacy: Apple’s New Software Has the Potential to Combat Abuse

By Esther Fultz

This August, Apple announced plans to release a tool called NeuralMatch that would scan photos being uploaded to iCloud and compare them to a database of known child abuse images.  Since then, controversy and privacy concerns have arisen around the software, leading Apple to postpone its release and spend more time developing the tool.
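Apple has published only a high-level description of how NeuralMatch works, but the core idea it describes, comparing a mathematical fingerprint of each uploaded photo against fingerprints of known abuse images, can be sketched roughly as follows.  The fingerprint function and the database below are hypothetical stand-ins: Apple’s actual system uses a perceptual hash designed to match visually similar images, not the cryptographic hash used here.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash.  A real perceptual hash is built so
    # that visually similar photos produce matching fingerprints; SHA-256
    # only matches byte-identical files, so this is illustrative only.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images,
# supplied by child-safety organizations.
KNOWN_FINGERPRINTS: set[str] = set()

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this photo's fingerprint appears in the database."""
    return image_fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```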

Jeffrey Simon, Associate Professor of Communication, said, “Trying to protect children and prevent underage explicit content is a great idea but the methodology of what they’re trying to implement goes against a lot of their core privacy and security message.”

Simon explained that some of Apple’s statements regarding the privacy of the software seem to contradict one another.  “While they make the statement that there’s anonymity, they also make the statement they can review [questionable photos] and take them to court to take further action,” said Simon. “That totally says that it’s not anonymous. There’s some very unclear things that they’re doing.”
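The tension Simon points to stems from how the system is designed to escalate: under Apple’s published design, individual matches are supposed to stay unreadable until an account accumulates enough of them, at which point the flagged photos can be decrypted for human review.  A toy sketch of that escalation logic follows; the threshold value and names are hypothetical, and the cryptography Apple describes for keeping matches sealed below the threshold is omitted entirely.

```python
from collections import defaultdict

# Hypothetical threshold; Apple's design escalates an account to human
# review only after some number of database matches accumulate.
REVIEW_THRESHOLD = 30

match_counts: defaultdict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Count one database match for an account and return True once the
    account crosses the threshold and is escalated to human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= REVIEW_THRESHOLD
```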

Beyond the privacy concerns is the question of whether the software would actually be effective.

George Huff, a Senior Professor of Social Work who spent 25 years practicing in child protective services, said, “My first thought is that if someone who would be creating Child Sexual Abuse Material (CSAM) knew this software would be on their phone, they just wouldn’t use that phone.  They would find another way to create those images.”

Simon also expressed concerns about effectiveness, pointing out the software’s focus on specific apps and its inability to scan the device as a whole.  He said he disagrees with Apple’s decision to focus solely on iCloud and iMessage and believes a more comprehensive service is needed to truly address the problem.

Additionally, Simon believes frequent imagery screening, modeled on Facebook’s comparative software, is key to making Apple’s software effective.

Effective or not, Huff and Simon agreed that using the software in its present condition could result in unintended consequences.

Huff expressed concern about the individuals who could be manually viewing imagery the software flags, and about what could happen if those individuals were not properly trained for the work.  Up to this point, Apple has been unclear on whether the people manually reviewing flagged images will be experienced in the field of child sexual abuse.  The company has also been unclear about what specifically qualifies as child sexual abuse material and who creates that definition.

Huff also explained that the software could potentially do children more harm than good.

“If the images were to the level that people had to look at them and an employee was to accidentally send them to someone they weren’t supposed to, that would be problematic,” Huff said.

Huff believes the most effective method of preventing child abuse is not technology at all, but trained professionals educating the general public on the subject.

“I think the answer is for professionals trained in the area of child abuse to educate individuals children come in contact with often – doctors, nurses, teachers, childcare providers – about the warning signs of sexual abuse and how to properly report it,” Huff said.

Simon also pointed to education as a solution, including education for creators of CSAM who might be struggling to overcome these destructive behaviors.  He added that education on cyber safety should target parents as well as children.

“I think we focus on kids a lot, teaching them how to be safe on the internet, and that’s good,” Simon said.  “But I think the parents need constant training, too, on how they can be good gatekeepers for the children and teens in their life.”

“I think any campaign that focuses on stopping the sharing of content like that should also support ways you can get help for that,” Simon added.

As a community, we should strive to create a safer online environment for children, and Apple’s software aims to do just that.  But we should make sure our efforts also protect individual privacy, something Apple may need to rethink before releasing any child protection software.

Esther Fultz is a sophomore Social Work major and an Off-Campus and On-Campus writer for Cedars.  She enjoys writing songs, spending time outdoors, drinking coffee, and hanging with friends.
