WhatsApp Fires Back At Apple's Child Safety Plan

WhatsApp head Will Cathcart said the company would not adopt Apple's new child safety measures, which are meant to stop the spread of child abuse imagery.

This comes a day after Apple confirmed plans for new software that will detect child abuse images in users' iCloud Photos libraries.

Cathcart said he is "concerned" by the plans. In a Twitter thread, he described the system as an "Apple built and operated surveillance system that can be easily used to scan private content for anything they or the government decides it wants to control." He said Apple had taken the wrong path to improve its response to child sexual abuse material.

An Apple spokesperson disputed Cathcart's claims and said that users could choose to disable iCloud Photos. Apple further noted that the system only checks images against a database of "known images" provided by the National Center for Missing and Exploited Children (NCMEC), and that because it is built into iOS itself, it cannot be made to work only in a specific region.
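To make the "known images" idea concrete, here is a rough, hypothetical sketch, not Apple's actual code, using an ordinary cryptographic hash from CryptoKit in place of Apple's own matching technology, of how a check against a database of known-image fingerprints could be structured:

```swift
// Illustrative sketch only, not Apple's implementation. An image is reduced
// to a fingerprint (here a plain SHA-256 hash) and compared against a set of
// fingerprints derived from known images.

import Foundation
import CryptoKit

/// Hypothetical database of fingerprints of known abuse imagery.
struct KnownImageDatabase {
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    /// Returns true only if the photo's fingerprint matches a catalogued entry.
    /// New or unknown images never match, which is why the check is described
    /// as limited to "known images".
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(fingerprint)
    }
}
```

In this simplified form only byte-for-byte copies of catalogued images would match; Apple's real system reportedly uses perceptual hashing so that resized or lightly edited copies can also match, which is not modeled here.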

Cathcart said that countries where iPhones are sold would have different definitions of what is acceptable.

“Will the system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?” asked Cathcart.

It is not surprising that Facebook has issues with Apple's plans. Apple has previously criticized Facebook over its record on privacy, even though the social network has embraced end-to-end encryption.

It is not just WhatsApp that has criticized Apple's new child safety measures. Others who have raised concerns include Edward Snowden, the Electronic Frontier Foundation, professors, and more.

How Will The New Apple Tool Work?

The new tool is meant to let parents play a more active role in helping their children navigate online communication. Through the software, Messages will use machine learning to analyze image attachments and determine whether the content being shared is sexually explicit.

This technology does not require Apple to access your child's private communications, since nothing is passed back to Apple's servers in the cloud.

If a sensitive photo is detected in a message, it will be blocked, and a label will appear below the photo stating, “This may be sensitive.”
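As a loose illustration of that flow, where the classifier and types below are hypothetical stand-ins rather than Apple's API, the on-device decision could be sketched like this:

```swift
import Foundation

/// Hypothetical on-device classifier standing in for the machine-learning
/// model described above; not an Apple API.
struct SensitiveContentClassifier {
    /// Returns true if the image is judged sexually explicit.
    /// The decision here is a stub; a real model would run locally on-device.
    func isSensitive(_ imageData: Data) -> Bool {
        return false // placeholder decision
    }
}

/// How an incoming photo is presented in the conversation.
enum IncomingAttachment {
    case shown(Data)              // photo displayed normally
    case blocked(label: String)   // photo hidden behind a warning label
}

/// Decide how to present an incoming photo. Everything happens on the device;
/// no image data is sent to a server in either branch.
func handleIncomingPhoto(_ imageData: Data,
                         using classifier: SensitiveContentClassifier) -> IncomingAttachment {
    if classifier.isSensitive(imageData) {
        return .blocked(label: "This may be sensitive")
    }
    return .shown(imageData)
}
```

The property the article highlights is that both the check and the photo stay on the device; only the way the photo is presented changes when the check fires.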

Read more about Apple's new child safety software here.

 

