UK tries to stop Facebook’s end-to-end encryption

The United Kingdom is planning a new attack on end-to-end encryption, with the Home Office spearheading efforts to discourage Facebook from further expanding the technology to its messaging applications.

Home Secretary Priti Patel plans to deliver a speech at a child protection charity event, setting out the alleged harms of end-to-end encryption and calling for tighter regulation of the technology. At the same time, a new report will argue that technology companies need to do more to protect children online.

According to a draft invitation seen by WIRED, Patel will speak at a round table organized on April 19 by the National Society for the Prevention of Cruelty to Children (NSPCC). The event is highly critical of the encryption standard, which makes it harder for investigators and technology companies to monitor communications between people and to detect child grooming or illegal content, including footage of child abuse.

End-to-end encryption works by securing communication between the parties involved: only the sender and recipient of messages can see their contents, and the platforms that provide the technology have no access to them. The technology has become increasingly standard in recent years, with WhatsApp and Signal encrypting messages by default to protect people’s privacy.
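The property described above can be sketched with a toy key exchange. This is an illustration only, not real cryptography: the prime is far too small, the XOR “cipher” is insecure, and real messengers use vetted protocols such as the Signal protocol. It shows why a platform that only relays public values and ciphertext cannot read the messages.

```python
import secrets

# Toy Diffie-Hellman sketch (illustrative only, NOT production crypto).
# Each endpoint keeps a private number and publishes a public value;
# both derive the same shared key, which the relaying platform never sees.

P = 2**127 - 1   # small Mersenne prime for readability; real groups are far larger
G = 3            # generator

def make_keys():
    private = secrets.randbelow(P - 2) + 2   # stays on this device
    public = pow(G, private, P)              # safe to send via the platform
    return private, public

def derive_key(my_private, their_public):
    return pow(their_public, my_private, P)  # same value on both ends

def xor_crypt(key, data: bytes) -> bytes:
    # Insecure stand-in for a real cipher, just to show the round trip.
    pad = key.to_bytes(16, "big")
    return bytes(b ^ pad[i % 16] for i, b in enumerate(data))

# Alice and Bob exchange only their public values; the platform relays them.
alice_priv, alice_pub = make_keys()
bob_priv, bob_pub = make_keys()

k_alice = derive_key(alice_priv, bob_pub)
k_bob = derive_key(bob_priv, alice_pub)
assert k_alice == k_bob                      # both ends hold one shared key

ciphertext = xor_crypt(k_alice, b"hello")    # all the platform ever sees
plaintext = xor_crypt(k_bob, ciphertext)     # only the endpoints can undo it
assert plaintext == b"hello"
```

The platform in the middle observes `alice_pub`, `bob_pub`, and `ciphertext`, none of which reveal the shared key; this is the sense in which the provider “does not have access to the content of messages.”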

The Home Office move comes as Facebook plans to implement end-to-end encryption across all of its messaging platforms – including Messenger and Instagram – a plan that has sparked a heated debate in the UK and elsewhere over the risks the technology allegedly poses to children.

During the event, the NSPCC will unveil a report on end-to-end encryption by PA Consulting, a UK firm that has advised the UK’s Department for Digital, Culture, Media and Sport (DCMS) on the upcoming online safety regulation. An early draft of the report, seen by WIRED, says that the increasing use of end-to-end encryption will protect the privacy of adults at the expense of children’s safety, and that any strategy technology companies use to mitigate the effect of the encryption will “almost certainly be less effective than the current ability to scan for harmful content.”

The report also suggests that the government devise regulations “explicitly targeting encryption” to prevent technology companies from “engineer[ing] away” their ability to police illegal communications. It recommends that the forthcoming online safety bill, which will impose duties of care on online platforms, make it mandatory rather than voluntary for technology companies to share information about online child abuse.

The online safety bill is expected to require companies whose services use end-to-end encryption to demonstrate how effectively they address the distribution of harmful content on their platforms – or run the risk of being fined by the communications regulator Ofcom, which will be in charge of enforcing the rules. As a last resort, Ofcom may demand that a company use automated systems to remove illegal content from its services.

The NSPCC says this setup does not go far enough on encryption: in a statement issued last week, the charity urged digital secretary Oliver Dowden to strengthen the proposed regulation to stop platforms from rolling out end-to-end encryption until they can show they can protect children’s safety. Facebook currently tackles the spread of child sexual abuse content on WhatsApp by removing accounts that display prohibited images in their profile photos, or groups whose names indicate illegal activity. WhatsApp says it bans more than 300,000 accounts a month suspected of sharing child sexual abuse material.

“Ofcom will have to pass a series of tests before it can act against a regulated platform,” said Andy Burrows, head of online safety policy at the NSPCC. “The concern is that it may require evidence of serious and persistent abuse, which in practice will be very difficult to gather because end-to-end encryption will take away a significant part of the flow of reports.”
