A third of child grooming cases involve Facebook apps, says UK charity
by Alice Tidey

At least a third of child grooming offences in England and Wales were carried out over Facebook-owned applications, according to figures released by a child welfare charity.
The National Society for the Prevention of Cruelty to Children (NSPCC) obtained the data via freedom of information (FOI) requests.
It found that 10,119 online grooming crimes were recorded by police in England and Wales in the two and a half years since a law made it illegal for adults to send sexual messages to children.
55% of cases
Police recorded information about how the child was groomed in just over half of those offences — 5,784 — and Facebook-owned applications including Facebook, Facebook Messenger, Instagram and WhatsApp were used in 55% of cases.
Half of the roughly 3,200 instances in which a Facebook-owned app was used involved Instagram. The privately owned social media platform Snapchat was used in over 1,060 cases.
The NSPCC also flagged that the number of offences is accelerating, with nearly a quarter of the cases taking place in the six months to October 2019, and warned that there could be "a sharper increase this year" due to the coronavirus-induced lockdowns and "industry failure to design basic child protection into platforms".
The charity called on the British government to deliver the Online Harms Bill within 18 months.
"In February, Digital Minister Matt Warman promised to publish an Online Harms Bill following proposals set out in a white paper. These proposals set out independent regulation of social networks with potential criminal sanctions if tech directors fail to keep children safe on their platforms," it noted.
"However, frustration is growing at delays to the legislation not now expected until the end of the year and concerns we might not see a regulator until 2023," the NSPCC added.
The latest child grooming data comes at a difficult time for Facebook, which plans to implement end-to-end encryption across all its messaging platforms, arguing the move will reinforce users' privacy.
End-to-end encryption means only the sender and recipient of a communication can decrypt it, shutting out any third party, including the messaging service itself or law enforcement. So far, WhatsApp is the only Facebook-owned application to have such encryption.
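For readers unfamiliar with the concept, the minimal Python sketch below, built on the open-source PyNaCl library, illustrates the principle. It is purely illustrative and not Facebook's implementation; WhatsApp's actual scheme is based on the far more elaborate Signal protocol.

```python
# Illustrative end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Not Facebook's implementation; it only demonstrates the core idea.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"hello, only you can read this")

# A server relaying `ciphertext` sees only random-looking bytes.
# Only the recipient, holding the matching private key, can decrypt it.
receiver_box = Box(bob_key, alice_key.public_key)
plaintext = receiver_box.decrypt(ciphertext)
assert plaintext == b"hello, only you can read this"
```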
Opposition mounting
Governments and activist shareholders have opposed the move, warning it could make it impossible to detect child exploitation cases.
Proxy Impact, a shareholder advocacy group promoting sustainable and responsible business practices, outlined its opposition ahead of Facebook's annual shareholders' meeting held on Thursday. It said that encryption "will provide child predators cover that will exponentially expand their outreach and the number of victims".
"The information and communications technology is the world's main facilitator of child sexual exploitation. Facebook is the world's largest social media company with 2.45 billion active monthly users. It is not unreasonable to expect a $70 billion company to help solve a problem that it has helped create — and one that Facebook and the tech industry are about to make much worse," it added.
The governments of the UK, Australia and the US have also voiced concerns, calling on Facebook in October 2019 to either abandon its plans for end-to-end encryption or provide "law enforcement court-authorised access to the content of communications to protect the public, particularly child users".
"Facebook's proposals would put at risk its own vital work that keeps children safe. In 2018, Facebook made 16.8 million reports of child sexual exploitation and abuse content to the National Center for Missing and Exploited Children, 12 million of which it is estimated would be lost if the company pursued its plan to implement end-to-end encryption," their joint letter said.
But Facebook responded at the time that it strongly opposes building so-called back doors into its encryption because doing so would "undermine the privacy and security of people everywhere."
The company argues that other technology, including PhotoDNA, makes it possible for it to find child exploitation content.
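PhotoDNA itself is a proprietary system developed by Microsoft, so the sketch below uses a simplified stand-in, a basic "average hash", to show the general idea: known abusive images are reduced to compact fingerprints that survive resizing and re-compression, and uploads are compared against a database of those fingerprints. The function names and the matching threshold are illustrative assumptions, not PhotoDNA's actual algorithm.

```python
# Simplified perceptual-hash sketch (not PhotoDNA's algorithm) using Pillow.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare an upload against known fingerprints.
# known_hashes = {average_hash("known_image_1.jpg"), ...}
# if any(hamming_distance(average_hash("upload.jpg"), h) <= 5
#        for h in known_hashes):
#     ...  # flag the upload for human review
```

Because the hash is computed from the image content rather than the raw bytes, a near-duplicate copy produces a fingerprint within a few bits of the original, which is what lets platforms match known material even after it has been re-encoded.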