Apple has removed all references to its controversial child sexual abuse material (CSAM) detection feature from its child safety webpage.
Announced in August, the CSAM feature was intended to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material.
It was one of three announced child safety features: scanning users' iCloud Photos libraries for CSAM, Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Two of the three safety features, which were released earlier this week with iOS 15.2, are still present on the page, which is titled "Expanded Protections for Children".
However, references to CSAM detection, whose launch was delayed following backlash from non-profit and advocacy groups, researchers and others, have been removed, MacRumors reports.
The tech giant, however, said its position has not changed since September, when it first announced it would delay the launch of CSAM detection.
"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple had said in September.
Following the announcement, the features were criticised by a wide range of individuals and organisations, including security researchers, privacy whistleblower Edward Snowden, Facebook's former security chief and politicians.
Apple endeavoured to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and giving interviews with company executives.
According to reports, an upcoming Apple iOS update will allow parents to protect their children and help them learn to navigate online communication in Messages.
The second developer beta of iOS 15.2 included support for the new Communication Safety feature in Messages.
With this update, Messages will be able to use on-device machine learning to analyse image attachments and determine whether a photo being shared is sexually explicit, TechCrunch had reported.
--IANS