Facebook has released its third Community Standards Enforcement Report, covering the fourth quarter of 2018 and the first quarter of 2019. The report has revealed that the social media giant took down 2.19 billion fake accounts in the first quarter of 2019.
That is up from the 1.2 billion fake accounts disabled in the last quarter of 2018. Facebook currently has 2.38 billion monthly active users, and the company estimates that about 5 percent of its monthly active accounts are fake.
“The number of accounts we took action on increased due to automated attacks by bad actors who attempt to create large volumes of accounts at one time,” said Facebook in a blog post.
The company uses three different methods to tackle fake accounts: it blocks accounts at the point of creation, removes them shortly after they sign up, and takes down existing accounts once they are identified as fake.
The report also contains data on appeals and restored content, along with data on regulated goods, i.e., attempted illicit sales of goods such as drugs and firearms.
Facebook treats such content as violating across its policy areas, which include spam, adult nudity and sexual activity, hate speech, bullying and harassment, violence and graphic content, regulated goods (drugs and firearms), terrorist propaganda, and child nudity and sexual exploitation of children.
The company further notes in the report that it has made progress in proactively identifying hate speech: in Q1 2019 it detected 65 percent of the hate speech content it acted on before users reported it, up from 59 percent in Q4 2018.
A summary of the report is available on Facebook's website.