Apple will use technology to look for sexual abuse images on iPhones

Apple has unveiled plans to scan iPhones in the US for images of child sexual abuse, drawing applause from child protection groups but raising concern that the system could be misused by governments seeking to surveil their citizens.

Apple said its messaging application will also gain the ability to identify and warn about sexually explicit content, without giving the company access to private communications.

The tool, which Apple calls ‘neuralMatch’, will detect known images of child sexual abuse without decrypting people’s messages. If a match is found, a human reviewer will examine the image and can notify the police if necessary.

But researchers warned that the tool could be put to other uses, such as government surveillance of dissidents or protesters.

Matthew Green of Johns Hopkins University, a leading cryptography researcher, warned that the system could be used to frame innocent people by sending them innocuous images engineered to trigger matches for child sexual abuse material, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this easily,” he said.

Tech companies such as Microsoft, Google and Facebook have for years shared ‘blacklists’ of known child sexual abuse images. Apple has also been scanning files stored on its iCloud service, which is not as securely encrypted as its messages, for such images.
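The matching described above can be illustrated with a deliberately simplified sketch. Real systems such as PhotoDNA or Apple’s neuralMatch use perceptual hashes that tolerate resizing and re-compression; the exact SHA-256 matching below, and the blocklist contents, are illustrative assumptions only, not Apple’s actual method.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known flagged images.
# (This digest is simply the SHA-256 of b"test", used as a stand-in.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_image(data: bytes) -> bool:
    """True if this content's digest appears on the shared blocklist."""
    return image_digest(data) in KNOWN_HASHES

print(matches_known_image(b"test"))   # True: digest is on the blocklist
print(matches_known_image(b"other"))  # False: digest is not listed
```

Because an exact cryptographic hash changes completely when even one byte of the file changes, production systems instead compute a perceptual fingerprint, which is precisely what makes the adversarial look-alike attacks researchers describe possible.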

The company has been under pressure from governments and police to allow surveillance of encrypted information.

Moving forward with these security measures will require Apple to strike a delicate balance between cracking down on the exploitation of children and maintaining its commitment to protecting its users’ privacy.

Apple believes it can strike that balance with technology it developed in consultation with several prominent cryptographers, including Dan Boneh, a professor at Stanford University, whose work in this field earned a Turing Award, often called the technology industry’s version of the Nobel Prize.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by police to identify child pornography online, acknowledged the potential for abuse of Apple’s system, but said it was far outweighed by the imperative to combat the sexual abuse of children.

“Is it possible? Certainly. But is it something that worries me? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argues that many other programs designed to protect devices from various threats operate in a similar way. WhatsApp, for example, provides its users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warns users not to open suspicious links.

Apple was one of the first major companies to embrace end-to-end encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed the company for access to that information in order to investigate crimes such as terrorism or the sexual exploitation of children.

“Apple’s increased protection of children is a game changer,” John Clark, president of the National Center for Missing &amp; Exploited Children, said in a statement. “With so many people using Apple products, these new security measures have the potential to save children’s lives,” he added.

Thorn President Julia Cordua said Apple’s technology balances “the need for privacy with the digital safety of children.” Thorn, a non-profit organization founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse.

