
Facebook releases May 2020 community standards enforcement report


WED, MAY 13 2020-theG&BJournal- Facebook on Tuesday, May 12, published the fifth edition of its Community Standards Enforcement Report, providing metrics on how well it enforced its policies from October 2019 through March 2020.
Announcing the report in a Facebook Newsroom post, Guy Rosen, Vice President of Integrity, said the company has over the last few years built tools, teams and technologies to help protect elections from interference, prevent misinformation from spreading on its apps and keep people safe from harmful content.
“So when the COVID-19 crisis emerged, we had the tools and processes in place to move quickly and we were able to continue finding and removing content that violates our policies. When we temporarily sent our content reviewers home due to the COVID-19 pandemic, we increased our reliance on these automated systems and prioritized high-severity content for our teams to review in order to continue to keep our apps safe during this time,” he said.
The report, he said, includes data only through March 2020, so it does not reflect the full impact of the changes the company made during the pandemic. “We anticipate we’ll see the impact of those changes in our next report, and possibly beyond, and we will be transparent about them. For example, for the past seven weeks we couldn’t always offer the option to appeal content decisions and account removals, so we expect the number of appeals to be much lower in our next report. We also prioritized removing harmful content over measuring our efforts, so we may not be able to calculate the prevalence of violating content during this time. Today’s report shows the impact of advancements we’ve made in the technology we use to proactively find and remove violating content,” Rosen said.
For the first time, the report includes metrics across 12 policies on Facebook and metrics across 10 policies on Instagram. The report introduces Instagram data in four issue areas, namely: Hate Speech, Adult Nudity and Sexual Activity, Violent and Graphic Content, and Bullying and Harassment. Also for the first time, the report shares data on the number of appeals people make on content the company took action against on Instagram, and the number of decisions overturned, either based on those appeals or when the company identifies the issue on its own. The report also contains data on Facebook’s efforts to combat organized hate on Facebook and Instagram.
Highlighting the progress the company has made so far in finding and removing violating content, Rosen said Facebook has improved the technology it uses to proactively detect violations, helping it remove more of that content so fewer people see it.
Said Rosen: “On Facebook, we continued to expand our proactive detection technology for hate speech to more languages, and improved our existing detection systems. Our proactive detection rate for hate speech increased by more than 8 points over the past two quarters totalling almost a 20-point increase in just one year. As a result, we are able to find more content and can now detect almost 90% of the content we remove before anyone reports it to us. In addition, thanks to other improvements we made to our detection technology, we doubled the amount of drug content we removed in Q4 2019, removing 8.8 million pieces of content.
“On Instagram, we made improvements to our text and image matching technology to help us find more suicide and self-injury content. As a result, we increased the amount of content we took action on by 40% and increased our proactive detection rate by more than 12 points since the last report. We also made progress in our work combating online bullying by introducing several new features to help people manage their experience and limit unwanted interactions, and we announced new Instagram controls today. We are sharing enforcement data for bullying on Instagram for the first time in this report, including taking action on 1.5 million pieces of content in both Q4 2019 and Q1 2020.”
Lastly, improvements to Facebook’s technology for finding and removing content similar to existing violations in its databases helped the company take down more child nudity and sexually exploitative content on Facebook and Instagram, Rosen disclosed.
Over the last six months, he said, Facebook has started to use technology more to prioritize content for its teams to review, based on factors such as virality and severity. Going forward, it plans to leverage technology to also take action on content, including removing more posts automatically. This will enable its content reviewers to focus their time on other types of content where more nuance and context are needed to make a decision.
The Community Standards Enforcement Report is published in conjunction with Facebook’s twice-yearly Transparency Report, which shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.
“In the future we’ll share Community Standards Enforcement Reports quarterly, so our next report will be released in August,” he said.
|twitter:@theGBJournal|email: info@govandbusinessjournal.com.ng|
