Big Brother? A new Apple update is raising privacy concerns

Do you own an Apple iPhone or iPad? If so, you are going to want to take a look at this.

Breitbart reports that an upcoming update for Apple’s iPhones and iPads will include a new feature that is raising privacy concerns.

What to know:

The new feature will allow Apple to scan personal photos that iPhone and iPad users have stored in Apple’s iCloud service.

The purpose of this scan is to combat child pornography. If Apple were to find sexually explicit images of children – Child Sexual Abuse Material (CSAM) – on a user’s iPhone or iPad, then the company would report that information to the National Center for Missing and Exploited Children (NCMEC), which collaborates with law enforcement agencies.

This feature will be included in Apple’s iOS 15 and iPadOS 15 software updates.

The backlash

Many experts, according to Breitbart, are now raising privacy concerns about Apple’s upcoming update. Among them is Edward Snowden, the National Security Agency (NSA) whistleblower.

Snowden tweeted: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Similarly, the Electronic Frontier Foundation wrote:

Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor. We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.

Apple’s response

Following the backlash, the company has gone into damage-control mode, working to assure users that it will maintain their privacy.

“Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,” Apple said. The company claimed it has a proven track record of refusing such demands.

The company added that the system “only works with CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations.”
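For readers curious what matching against “image hashes” means in practice, here is a minimal, purely illustrative sketch of the general idea: each photo is reduced to a fingerprint, and that fingerprint is checked against a list of known fingerprints. This is not Apple’s implementation; Apple’s system is reported to use perceptual “NeuralHash” values and on-device cryptographic matching rather than the plain SHA-256 comparison below, and every name in the snippet (knownHashes, matchesKnownHash) is hypothetical.

import Foundation
import CryptoKit

// Illustrative stand-in for a database of known image fingerprints.
// In the scheme described above, these would be supplied by NCMEC and
// other child-safety organizations; the value here is a placeholder.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Hex-encode a SHA-256 digest of a photo's raw bytes.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// True if the photo's fingerprint appears in the known-hash set.
func matchesKnownHash(_ photoData: Data) -> Bool {
    knownHashes.contains(sha256Hex(of: photoData))
}

// Example with made-up bytes standing in for an uploaded photo.
let upload = Data("example photo bytes".utf8)
print(matchesKnownHash(upload))   // prints "false" for this placeholder data

The point of the sketch is simply that detection of this kind compares fingerprints against a fixed list rather than “looking at” photos the way a person would; the privacy debate above is about who controls that list and what else it could come to contain.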

Nonetheless, many remain skeptical. We’ll have to see whether this move pushes some users away from Apple’s phones and tablets.
