Apple confirms it will start scanning iCloud and Messages to detect child abuse images

Apple will begin using a system that detects sexually explicit photos in Messages and compares photos uploaded to iCloud against a database of known Child Sexual Abuse Material (CSAM), to help point law enforcement to potential predators.
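At its core, the CSAM-detection side of the system is a fingerprint-matching scheme: each photo is reduced to a compact hash and checked against a database of hashes of known abusive images. The sketch below illustrates only that general idea in Python; the file names, hash values and helper functions are invented for illustration, and Apple's real system relies on a perceptual "NeuralHash" with on-device matching and cryptographic thresholds rather than a plain cryptographic digest.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image fingerprints. In a real deployment these
# would be perceptual hashes supplied by child-safety organizations, not SHA-256
# digests of file bytes.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Return a hex digest of the file contents (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose fingerprint appears in the known-hash database."""
    return [p for p in photo_dir.glob("*.jpg") if fingerprint(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"Match found: {match}")
```

A cryptographic hash like the one above only matches byte-identical files; a perceptual hash is designed to survive resizing, re-compression and minor edits, which is why systems of this kind use it instead.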

The announcement (via Reuters) says these new child safety measures will arrive with iOS 15, watchOS 8 and macOS Monterey later this year. The move comes years after Google, Facebook and Microsoft put similar systems in place: Google has used hash-matching to detect such images since 2008, Microsoft introduced its PhotoDNA system in 2009, and Facebook and Twitter followed in 2011 and 2013 respectively.
