Apple Appeals Copyright Loss Against Security Research Firm While Touting Researchers

Apple on Tuesday appealed a copyright case it lost against security startup Corellium, which helps researchers examine programs like Apple’s planned new method for detecting child sex abuse images.

A federal judge last year rejected Apple’s copyright claims against Corellium, which makes a simulated iPhone that researchers use to examine how the tightly restricted devices function.

Security experts are among Corellium’s core customers, and the flaws they uncovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter who killed several people in San Bernardino, California.

Apple makes its software hard to examine, and the specialised research phones it offers to pre-selected experts come with a host of restrictions. The company declined to comment.

The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.

“Enough is enough,” said Corellium Chief Executive Amanda Gorton. “Apple can’t pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal.”

Under Apple’s plan announced earlier this month, software will automatically check photos slated for upload from phones or computers to iCloud online storage to see if they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will look to make sure the images are illegal, then cancel the account and refer the user to law enforcement.
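The matching scheme described above can be sketched in code. This is a deliberately simplified illustration, not Apple's actual system: Apple's implementation uses perceptual hashing (NeuralHash) and cryptographic protocols, whereas the exact SHA-256 digests, the `KNOWN_BAD_HASHES` set, and the `MATCH_THRESHOLD` value below are all hypothetical stand-ins chosen only to show the threshold-then-review flow.

```python
import hashlib

# Hypothetical, simplified sketch of threshold-based hash matching.
# Apple's real system uses perceptual hashes and cryptographic
# techniques; exact SHA-256 digests here are only a stand-in.

# Assumed database of identifiers for known images (illustrative only).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Assumed threshold: an account is flagged only after this many matches.
MATCH_THRESHOLD = 2

def count_matches(photo_bytes_list):
    """Count how many uploaded photos match a known identifier."""
    return sum(
        hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES
        for data in photo_bytes_list
    )

def should_flag_for_review(photo_bytes_list):
    """Refer for human review only once the match threshold is crossed."""
    return count_matches(photo_bytes_list) >= MATCH_THRESHOLD

uploads = [b"known-image-1", b"vacation-photo", b"known-image-2"]
print(should_flag_for_review(uploads))  # two matches meet the threshold
```

The threshold step mirrors the article's description: single matches do not trigger anything on their own; only an accumulation of matches leads to human review.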

“‘We’ll prevent abuse of these child safety mechanisms by relying on people bypassing our copy protection mechanisms’ is a pretty internally incoherent argument,” tweeted David Thiel of the Stanford Internet Observatory.

Digital rights groups have objected to the plan because Apple has marketed itself as devoted to user privacy, and because other companies scan content only after it is stored online or shared.

One of their main arguments has been that governments theoretically could force Apple to scan for prohibited political material as well, or to target a single user.

In defending the program, Apple executives said researchers could verify the list of banned images and examine what data was sent to the company in order to keep it honest about what it was seeking and from whom.

One executive said that such reviews made the approach better for privacy overall than would have been possible if the scanning occurred in Apple’s storage, where it keeps the code secret.

© Thomson Reuters 2021

