Coincidences matter, and some notable ones happened recently. Just a few days earlier, the Cabinet decided to extend the timeline of the POCSO fast-track courts. Now, Apple has announced new measures against child sexual abuse. Child sexual abuse is a pressing issue that has only grown in recent times. File an RTI application with the Ministry of Women & Child Development, or check the National Crime Records Bureau (NCRB) portal, to see how many child sex abuse cases are registered each year; the figures are disturbing. With this update, Apple is showing a firm intent to curb such incidents. Let’s look at this news in detail in this Exhibit blog.
Apple’s Tool for Child Sex Abuse
According to reports, Apple is developing a tool that can scan for child sexual abuse images. The reports describe how Apple is working towards child safety and other measures that can reduce the frequency of such cases. Apple will ship this tool with its flagship operating systems, including iOS 15, watchOS 8, and macOS Monterey. Apple has also announced that it is enhancing its default Messages app to tackle this issue.
How will this tool work?
In the Messages app, Apple will use on-device machine learning to detect when a sexually explicit image is sent or received. If a child’s device receives such an image, it will be blurred, the child will be warned, and parents can be notified if the child chooses to view it. Separately, the tool will scan images before they are uploaded to iCloud Photos and compare their hashes against a large database of known child sexual abuse material. When matches are found, Apple can report them to the National Center for Missing & Exploited Children (NCMEC), a US non-profit that works with law enforcement.
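To make the image-matching idea concrete, here is a minimal sketch in Python. It is illustrative only: Apple’s real system uses a perceptual hash (NeuralHash), which matches visually similar images even after resizing or recompression, whereas this sketch uses an exact cryptographic hash; the database contents and function names here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery (placeholder values).
# In Apple's announced design, the database comes from NCMEC and uses
# perceptual hashes, not exact SHA-256 digests as shown here.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# A byte-identical copy matches; an unrelated image does not.
print(matches_known_database(b"known-flagged-image-bytes"))  # True
print(matches_known_database(b"ordinary-holiday-photo"))     # False
```

The key design point is that only hashes are compared, so the matching step never needs to "look at" or store the user’s photo content itself.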
Potential of Apple’s Tool
The new Apple tool has immense potential to reduce child sexual abuse cases. Apple could consider collaborating with other social media and messaging companies so their apps can raise a red alert using the same tool, without storing the content anywhere, thereby keeping privacy intact. Even in India, the MoWCD and the Ministry of Home Affairs (MHA) should evaluate this tool, which could help their law enforcement agencies catch culprits.
Apple has taken a big step towards the betterment of society. It will be great to see other smartphone makers follow suit and build similar tools to make the world safer, especially for children.