Thursday, March 28, 2024

Facing global criticism, Apple says its ‘CSAM’ detection tool will not breach privacy

This comes after the tech giant faced global criticism over its iCloud Photos and Messages child safety initiatives, with critics calling the tool a form of government spyware

Sidestepping the criticism, the global technology major Apple has expressed concern over alleged child abuse material on iCloud while making it clear that it will not allow any government to conduct surveillance via its tool aimed at detecting and curbing child sexual abuse material (CSAM) in iCloud Photos.

Last week, the company confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery.

Apple said it will not accede to any government's request to expand the technology. “Apple will refuse any such demands. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company said.

According to Apple, the tool does not affect users who have not chosen to use iCloud Photos. “There is no impact to any other on-device data. This feature does not apply to Messages,” the company said. Earlier, Epic Games CEO Tim Sweeney had attacked Apple over the iCloud Photos child safety initiative.

“This is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to the government,” Sweeney posted on Twitter.

Earlier, WhatsApp head Will Cathcart also slammed Apple over its plans to launch photo identification measures, saying that Apple's software can scan all the private photos on a user's phone, which he called a clear privacy violation.

Stressing that WhatsApp will not allow such Apple tools to run on its platform, Cathcart said that Apple has long needed to do more to fight child sexual abuse material (CSAM), “but the approach they are taking introduces something very concerning into the world.”

Apple said that the ‘CSAM detection in iCloud Photos’ tool is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images.

“This technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it,” Apple said.

Tech Observer Desk
Tech Observer Desk at TechObserver.in is a team of technology reporters, led by a senior editor, that brings the latest updates and developments from the world of technology.

