For every 10,000 views of content on Facebook, an estimated 11 to 14 contained nudity or sexual content that violated Facebook’s policies.
From October 2018 to March 2019, Facebook removed more than 3 billion fake accounts. The company also estimates that about 5 percent of its monthly active user base is fake.
Guy Rosen, Facebook’s vice president for integrity, said in a blog post on Thursday: “For fake accounts, the amount of accounts we took action on increased due to automated attacks by bad actors who attempt to create large volumes of accounts at one time.”
“We estimated for every 10,000 times people viewed content on Facebook, 25 views contained content that violated our violence and graphic content policy.”
“During the second half of 2018, the volume of content restrictions based on local law increased globally by 135 percent, from 15,337 to 35,972.”
In the second half of 2018, Facebook identified 53 disruptions of its services across nine countries, compared to 48 disruptions across eight countries in the first half of 2018.
In this period, on Facebook and Instagram, the company took down 2,595,410 pieces of content based on 511,706 copyright reports; 215,877 pieces of content based on 81,243 trademark reports; and 781,875 pieces of content based on 62,829 counterfeit reports.
The company added: “This increase was primarily driven by 16,600 items we restricted in India based on a Delhi High Court order regarding claims made about PepsiCo products.”
“This half, India accounted for 85 percent of total new global disruptions,” said the company.
“In Q1 2019, we took action on about 900,000 pieces of drug sale content, of which 83.3 percent we detected proactively. In the same period, we took action on about 670,000 pieces of firearm sale content, of which 69.9 percent we detected proactively,” added Rosen.
Source: The News Minute