Meta, the owner of Instagram and Facebook, has confirmed in its report for the first quarter of 2022 that it took action against 1.8 billion pieces of spam and 21.7 million pieces of violent content.
According to the transparency report released on Tuesday evening, in the last quarter of last year the amount of spam content was 1.2 billion and the amount of violent content was 1.24 million, both lower than the current figures.
According to Meta, 1.8 million drug-related posts were removed from Instagram during the first quarter of this year.
The rate at which systems identified harassment and bullying content stood at 67% during this period, compared with 67.8% in the previous quarter.
The report released by Meta reviews the enforcement of 14 Facebook policies and 12 Instagram policies. According to the company, enforcement of the policy on sexual content on Facebook and Instagram has also improved.
In enforcing Instagram and Facebook policies, sexually explicit material was taken down, while content with educational and medical purposes was allowed to remain.
Between January and March 2022, the prevalence of sexually explicit content on Facebook was 0.04%, meaning an estimated four out of every 10,000 views contained nudity or sexual content. Previously, the proportion was 0.14%.
Meta removed 31 million sexually explicit posts from Facebook, compared to 14 million on Instagram.
Of the actions taken against sexual content, 274,000, or 3.30% of the total, followed user complaints, while the remaining 96.70% of the content was identified and acted on automatically before any complaint was filed.
After appeals from users whose sexually explicit posts had been removed, Meta restored 287,000 posts on Facebook and 184,000 posts on Instagram.
Meta's report covers enforcement of Facebook and Instagram policies on child endangerment, bodily harm, sexual harassment, dangerous organizations, terrorism, fake accounts, hate speech, weapons and drugs, spam, and suicide and self-harm, including actions taken against related content.
Action was taken against 2.5 million posts related to dangerous organizations and individuals, and 16.1 million posts linked to terrorist individuals and organizations.
Action was taken against 1.6 billion fake accounts, down from 1.7 billion in the previous quarter.
According to Meta, during the first quarter of this year, the activity of fake accounts on Facebook accounted for 5% of the total active accounts.