Facebook has published its latest “enforcement report”, which details how many posts and accounts it took action on between October 2018 and March 2019.
During that six-month period, Facebook removed more than three billion fake accounts – more than ever before.
More than seven million “hate speech” posts were removed, also a record high.
For the first time, Facebook also reported how many deleted posts were appealed, and how many were put back online after review.
Facebook said the rise in the number of deleted fake accounts was because “bad actors” were using automated methods to create large numbers of them.
But it said it spotted and deleted most of them within minutes, before they had any opportunity to “cause harm”.
The social network will now also report how many posts were removed for selling “regulated goods” such as drugs and guns.
It said it took action on more than one million posts selling guns in the six-month period covered by the report.
For some types of content, such as child sex abuse imagery, violence and terrorist propaganda, the report estimates how often such content was actually seen by people on Facebook.
The report said that out of every 10,000 pieces of content viewed on Facebook:
* fewer than 14 contained nudity
* about 25 contained violence or graphic content
* fewer than three contained child abuse imagery or terrorist propaganda
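For scale, the per-10,000 figures above can be restated as percentages of all content views. The sketch below simply converts the report's numbers; the “fewer than” figures are treated as upper bounds, and the labels are paraphrased from the list above.

```python
# Restate the report's per-10,000 prevalence figures as percentages.
# These numbers come from the list above; "fewer than" values are
# treated here as upper bounds on the rate.
rates_per_10k = {
    "nudity (upper bound)": 14,
    "violence or graphic content (approx.)": 25,
    "child abuse / terror propaganda (upper bound)": 3,
}

for label, per_10k in rates_per_10k.items():
    pct = per_10k / 10_000 * 100  # e.g. 14 per 10,000 views is 0.14%
    print(f"{label}: {per_10k} per 10,000 views \u2248 {pct:.2f}%")
```

Even the largest of these rates, for violence or graphic content, works out to roughly a quarter of one percent of views.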
Overall, about 5% of the monthly active users on Facebook were fake accounts.
For the first time, the report reveals that between January and March 2019 more than one million appeals were made after posts were deleted for “hate speech”.
About 150,000 posts were restored during that period after review found they had not broken the hate speech rules.
Facebook said the report highlighted “areas where we could be more open in order to build more accountability and responsiveness to the people who use our platform”.