Facebook’s owner Meta has delayed rolling out encrypted messaging until at least 2023 after fears such a move could put children at greater risk of abuse.
Meta, which owns the social networking giant as well as the messaging service WhatsApp, said it is taking time to ‘get this right’ and pledged to strike a balance between privacy and safety online.
End-to-end encryption hides messages from everyone except those in a conversation, and has previously sparked warnings that it threatens children’s safety online.
WhatsApp already features full end-to-end encryption.
Antigone Davis, head of safety at Meta, said it will keep working with experts to tackle abuse, but insisted that in previous cases the firm was still in a position to help authorities despite services being encrypted.
Writing in the Sunday Telegraph, she said: ‘Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted.
‘While no systems are perfect, this shows that we can continue to stop criminals and support law enforcement.
‘We’ll continue engaging with outside experts and developing effective solutions to combat abuse because our work in this area is never done.

‘We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.’
Last year, the then children’s commissioner for England Anne Longfield said plans by social media firms for more encryption in messaging services would place children at greater risk by making it impossible for platforms to monitor content and prevent the police from gathering potentially vital evidence of child sexual exploitation.
Facebook’s plans for encryption have also been previously criticised by the Government, with Home Secretary Priti Patel warning it puts children at risk and offers a hiding place for abusers and other criminals.
Ms Davis said the firm is ‘determined to protect people’s private communications and keep people safe online’, but added that ‘people shouldn’t have to choose between privacy and safety’.
Home Secretary Priti Patel warned encryption puts children at risk and offers a hiding place for abusers and other criminals
She added that the firm is ‘building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right’.
The company set out its ‘three-pronged approach’ which it said involves preventing harm, giving people more control, and quickly responding if something happens.
It comes a day after the head of Ofcom reportedly called for social media companies to face sanctions if they do not prevent adults from directly messaging children.
The communications watchdog will regulate the sector under the Online Harms Bill and has the power to fine companies and block access to sites.
The Times reported Dame Melanie Dawes will encourage the regulator to closely examine direct messaging when the new regulations are introduced in 2023.
Speaking about the industry and the bill, Dame Melanie said: ‘I don’t think it’s sustainable for them to carry on as we are. Something’s got to change.
Meta’s app offering has been heavily criticised for its handling of children’s safety issues
‘What regulation offers them is a way to have consistency.’
The proposals in the Online Harms Bill include punishments for non-compliant firms such as large fines of up to £18 million or 10% of their global turnover — whichever is higher.
In August, Instagram announced it would require all users to provide their date of birth, while Google has introduced a raft of privacy changes for children who use its search engine and YouTube platform.
TikTok also began limiting the direct messaging abilities of accounts belonging to 16 and 17-year-olds, as well as offering advice to parents and caregivers on how to support teenagers when they sign up.
Andy Burrows, head of child safety online policy at the NSPCC, said: ‘Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms.
‘But they should only go ahead with these measures when they can demonstrate they have the technology in place that will ensure children will be at no greater risk of abuse.
‘More than 18 months after an NSPCC-led global coalition of 130 child protection organisations raised the alarm over the danger of end-to-end encryption, Facebook must now show they are serious about the child safety risks and not just playing for time while they weather difficult headlines.’