A group of researchers has said that Apple's photo-analysis technology is dangerous, invasive, and ineffective at detecting images of child sexual abuse

More than a dozen prominent cybersecurity experts have accused Apple of relying on dangerous technology in its controversial plan to detect child sexual abuse images on iPhones.

In August, Apple announced new photo-scanning features coming to iOS that use hashing algorithms to match content in users' photo libraries against known child sexual abuse material. The iPhone downloads a set of fingerprints representing the illegal content, then compares each photo in the user's library against that list.
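The fingerprint-matching idea can be sketched with a toy perceptual "average hash". This is not Apple's NeuralHash, which is a proprietary neural-network-based hash; the images, fingerprint list, and distance threshold below are illustrative assumptions only.

```python
# Toy sketch of fingerprint matching with a perceptual "average hash".
# NOT Apple's NeuralHash; everything here is illustrative.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical downloaded fingerprint list for known illegal images.
known_fingerprints = {average_hash([[10, 200], [220, 30]])}

def matches_known(pixels, max_distance=1):
    """True if the photo's hash is close to any known fingerprint."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_fingerprints)

print(matches_known([[12, 198], [221, 29]]))  # True: near-duplicate image
print(matches_known([[200, 10], [30, 220]]))  # False: unrelated image
```

Unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce nearby fingerprints, which is why matching uses a distance threshold rather than exact equality.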

Apple said the scanning technology is part of a new series of child protection systems that will evolve and develop over time.

The system, called neuralMatch, will proactively alert a team of human reviewers if it believes illegal images have been detected. According to the Financial Times article that first reported the news, the human reviewers will then contact law enforcement if the material can be verified. The neuralMatch system, trained on 200,000 images from the National Center for Missing & Exploited Children, will be deployed first in the United States and then in the rest of the world. Photos will be hashed and compared against a database of known images of child sexual abuse.

According to Cupertino’s explanation, each photo uploaded to iCloud in the United States will receive a “safety voucher” indicating whether or not it is suspect. Once a certain number of photos are flagged as suspect, Apple will decrypt all of the suspect photos and, if they appear to be illegal, forward them to the relevant authorities. Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account, the company said, in an attempt to reassure users that their data remains private.
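The threshold gating described above can be modeled with a minimal sketch: individual matches reveal nothing, and review becomes possible only once enough safety vouchers accumulate. Apple's actual design uses threshold secret sharing to enforce this cryptographically; the class below models only the gating logic, not the cryptography, and the default of 30 reflects the match threshold Apple later cited publicly.

```python
# Minimal sketch of threshold gating for "safety vouchers".
# Models the gating logic only, not Apple's threshold secret sharing.

class VoucherStore:
    def __init__(self, threshold=30):
        self.threshold = threshold
        self.vouchers = []

    def add(self, voucher):
        """Record a safety voucher for one flagged photo upload."""
        self.vouchers.append(voucher)

    def reviewable(self):
        """Return vouchers for human review only past the threshold."""
        if len(self.vouchers) >= self.threshold:
            return list(self.vouchers)
        return None  # below threshold: nothing can be inspected

store = VoucherStore(threshold=3)
store.add("voucher-1")
store.add("voucher-2")
print(store.reviewable())  # None: still below threshold
store.add("voucher-3")
print(store.reviewable())  # all three vouchers become reviewable
```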

The initiative has caused division, even among Apple employees. Matthew Green, a Johns Hopkins University professor and cryptographer, said on Twitter: “This kind of tool can be a godsend for finding pornography on people’s phones… But imagine what it could do in the hands of an authoritarian government.”

In addition, according to the researchers, although the system is currently trained to track child sexual abuse, it could be adapted to detect any other imagery, for example terrorist beheadings or anti-government signs at protests. The dangers do not stop there, and could extend to other platforms.

The precedent set by Apple could also put pressure on other tech companies to use similar techniques. “Governments will demand it of everyone,” Green worried. Alec Muffett, a security researcher and privacy activist who has worked at Facebook and Deliveroo, said Apple’s decision was “tectonic” and a “huge and regressive step for privacy.” “Apple is walking back privacy to enable 1984 [editor’s note: George Orwell’s most famous dystopian novel, published in 1949],” he said.

Initially, the features were to be deployed as part of iOS 15, released in September. “This innovative new technology enables Apple to provide valuable and actionable information to the National Center for Missing and Exploited Children and law enforcement regarding the proliferation of known CSAM [child sexual abuse material],” the company said. But the public outcry forced Apple to revise its roadmap.

Apple tried, unsuccessfully, to reassure

In an internal memo intended for the teams that worked on the project, Apple acknowledged the “misunderstandings” around the features, while saying they are part of an “important mission” to keep children safe. The document, released the same day as the new features, was written by Sébastien Marineau-Mes, a software VP at Apple. Marineau-Mes says Apple will continue to explain and detail the features included in this suite of expanded protections for children. Here is the memo in its entirety:

Today marks the official public unveiling of the expanded protections for children, and I wanted to take a moment to thank each of you for your hard work over the past few years. We would not have reached this important milestone without your tireless dedication and resilience. Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal required a deep, cross-functional commitment spanning engineering, general administration, IT, legal, product marketing, and public relations.

What we announced today is the result of this collaboration: a product that provides tools to protect children while respecting Apple’s deep commitment to user privacy. We have seen a lot of positive reactions today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so that people understand what we have built.

And while there is still a lot of work to be done to deliver the features in the coming months, I wanted to share this note we received today from NCMEC. I found it incredibly motivating, and I hope you will too. I am proud to work at Apple with such a great team. Thank you, everyone!

The memo also included a message from the National Center for Missing and Exploited Children, signed by Marita Rodriguez, executive director of strategic partnerships. Apple is working closely with NCMEC on the new iCloud scanning features. Here is the full text of the NCMEC note sent to the Apple team working on them:

Apple team, I wanted to share a word of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the amazing decisions you have made in the name of putting the protection of children first. It was invigorating for our entire team to see (and play a small part in) what you unveiled today. I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder.

Our commitment to supporting children who have experienced the most unimaginable abuse and victimization will be stronger. During these long days and sleepless nights, I hope you find comfort in knowing that, thanks to you, thousands of child victims of sexual exploitation will be rescued and will get the chance to heal and live the childhood they deserve. Thank you for finding a way forward to protect children while preserving privacy!

Cybersecurity researchers step up to the plate

In a 46-page document released Thursday, a group of researchers said the technology was dangerous, invasive, and ineffective at detecting images of child sexual abuse.

The cybersecurity researchers said they began their study before Apple’s announcement. Documents released by the European Union and a meeting with EU officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse, but also for signs of organized crime and indications of terrorist links.

A proposal to allow this kind of photo scanning in the European Union could arrive as soon as this year, the researchers say.

They said they were releasing their findings now to inform the European Union of the dangers of its plan, and because “the expansion of state surveillance powers really is crossing a red line,” said Ross Anderson, professor of security engineering at the University of Cambridge and a member of the group.

Surveillance concerns aside, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed out ways to avoid detection by slightly modifying the images.
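The evasion claim can be illustrated with the same toy average hash: brightening a single pixel past the image mean flips a hash bit, nudging an image away from its fingerprint while leaving it visually similar. Apple's NeuralHash is far more robust than this toy, but researchers demonstrated analogous manipulations (and collisions) against it; the pixel values below are illustrative only.

```python
# Toy illustration: a small edit flips a perceptual-hash bit.
# Illustrative only; NOT a model of NeuralHash's actual robustness.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200], [220, 30]]
edited = [[10, 200], [220, 150]]  # one dark pixel brightened

h0, h1 = average_hash(original), average_hash(edited)
print(hamming(h0, h1))  # 1: one hash bit flipped by a small edit
```

Enough such flips push the edited image past the matcher's distance threshold, which is the essence of the evasion the researchers described.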

“The technology allows scanning of a personal, private device without any probable cause that anything illegitimate is being done,” added another member of the group, Susan Landau, professor of cybersecurity and policy at Tufts University. “It is extraordinarily dangerous. It is dangerous for business, national security, public safety, and privacy.”

Sources: researchers’ report, EU

And you ?

What do you think of the researchers’ conclusions?
What do you think of the warnings to the EU about a plan that would scan not only for images of possible child sexual abuse, but also for signs of organized crime and indications of terrorist links?

