Facebook reveals 8.7M posts removed over child exploitation links
By Martyn Landi, Press Association Technology Correspondent
Facebook has removed more than eight million pieces of content linked to child exploitation in the last three months, the company revealed.
The social network's global head of safety, Antigone Davis, said 8.7 million pieces of content violating its rules on child nudity and child exploitation had been removed.
The company said 99% of this content was removed before being reported.
Ms Davis used a blog post to discuss how the firm is using technology to prevent such content from being seen by users, including how it uses artificial intelligence to identify offensive content.
She said the company also used photo-matching technology as well as specialist human reviewers to find, identify and remove content that breached its community standards.
She added that Facebook also works with organisations such as the US-based National Center for Missing and Exploited Children (NCMEC), to which it reports such content.
"Our Community Standards ban child exploitation and to avoid even the potential for abuse, we take action on nonsexual content as well, like seemingly benign photos of children in the bath," she said.
"With this comprehensive approach, in the last quarter alone, we removed 8.7 million pieces of content on Facebook that violated our child nudity or sexual exploitation of children policies, 99% of which was removed before anyone reported it. We also remove accounts that promote this type of content.
"We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC."
She said Facebook was helping the organisation develop software to prioritise the reports it receives and shares with law enforcement, so that the most serious cases can be addressed first.
"We also collaborate with other safety experts, NGOs and companies to disrupt and prevent the sexual exploitation of children across online technologies.
"For example, we work with the Tech Coalition to eradicate online child exploitation, the Internet Watch Foundation, and the multi-stakeholder WePROTECT Global Alliance to End Child Exploitation Online.
"And next month, Facebook will join Microsoft and other industry partners to begin building tools for smaller companies to prevent the grooming of children online."
The social network has been on a recent public drive to promote tools it is introducing to better police the platform.
A verification system has been introduced in the UK for those wishing to post political adverts on the site, and the company recently revealed it has created a physical "war room" at its California headquarters to monitor attempts to interfere in the upcoming US mid-term elections. (PA)