Thursday, September 9, 2021

Priti Patel says Apple should see through CSAM photo scanning measures

"Apple state their child sexual abuse filtering technology has a false positive rate of 1 in a trillion, meaning the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out. They need to see through that project."

What you need to know

  • Apple recently pumped the brakes on its Child Safety measures.
  • UK Home Secretary Priti Patel has called on the company to see out the project.
  • It comes as the UK launches a new Safety Tech Challenge Fund.

UK Home Secretary Priti Patel has called on Apple to see through its Child Safety project, including a measure that would scan iCloud content for known Child Sexual Abuse Material.

It comes as the Home Office launched a new Safety Tech Challenge Fund. Patel stated:

It is utterly appalling to know that the sexual abuse of children is incited, organised, and celebrated online. Child abusers share photos and videos of their abhorrent crimes, as well as luring children they find online into sending indecent images of themselves.

It is devastating for those it hurts and happens on a vast and growing scale. Last year, global technology companies identified and reported 21 million instances of child sexual abuse.

The Home Secretary says it is a misconception that abuse like this takes place in the "dark corners of the web", and said that end-to-end encryption presented a massive challenge for public safety:

End-to-end encrypted messaging presents a big challenge to public safety, and this is not just a matter for governments and law enforcement. Social media companies need to understand they share responsibility for keeping people safe. They cannot be passive or indifferent about what their products enable or how they might inadvertently blind themselves and law enforcement from protecting children with end-to-end encryption.

Patel also praised Apple's recent CSAM measures, which the company has put on hold following feedback from experts and privacy advocates. She called on the company "to see through that project", stating the tech's 1 in a trillion false positive rate meant "the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out."

Apple's CSAM scanning measures have drawn criticism from some because the matching takes place on-device, which critics consider intrusive. As noted, Apple has now put the plans on hold.
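For context on the "1 in a trillion" figure: Apple's published design matches perceptual hashes of photos on-device against a database of known CSAM hashes, and an account is only flagged once the number of matches crosses a threshold. The sketch below is a loose illustration of that threshold idea only, not Apple's actual NeuralHash or private-set-intersection implementation; every name and value in it is a hypothetical placeholder.

```python
import hashlib

# Hypothetical placeholders, not Apple's real design or data.
KNOWN_HASHES: set[str] = set()   # would hold hashes of known CSAM images
MATCH_THRESHOLD = 30             # flag an account only past many matches

def image_hash(image_bytes: bytes) -> str:
    # Stand-in hash: a real system would use a perceptual hash that
    # tolerates resizing and re-encoding, not a cryptographic digest.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_in_library(photos: list[bytes]) -> int:
    # Count how many photos match the known-hash database.
    return sum(image_hash(p) in KNOWN_HASHES for p in photos)

def should_flag(photos: list[bytes]) -> bool:
    # The threshold does the statistical work: a single chance collision
    # is never enough on its own to flag an account.
    return matches_in_library(photos) >= MATCH_THRESHOLD
```

Requiring many independent matches rather than one is what drives the claimed per-account false-positive probability so low.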

