
Moderating Facebook: The Dark Side of Social Networking


Many websites strive to provide the best possible environment for their users so that people keep coming back. They want visitors to feel safe and not be exposed to inappropriate content. To achieve that, many sites employ professional content moderators who sort through reported and flagged content and decide what crosses the line.

Social media moderation is a booming business. Many firms specialize in it, often outsourcing the work cheaply to workers in India, the Philippines, and other countries with a lower cost of living. Content moderators work in many countries, though; in the United States, recent college graduates make up the bulk of them.

Moderating Facebook

Even though you're annoyed every day by viral videos, game requests, and clickbait headlines, Facebook could be a much worse place. A dedicated team of moderators filters through the worst of humanity that gets posted there every day.

Facebook Generates Massive Traffic

Moderating Facebook is not an easy job. Let’s take a look at the size of Facebook:

  • 1.35 billion active users each month
  • 4.5 billion likes each day
  • 864 million daily log-ins
  • 5 new profiles created every second
  • 83 million total fake profiles
  • 300 million photo uploads each day
  • 4.75 billion posts shared each day
  • 20-minute average time on the site per visit

Who Is Moderating Facebook Content?

There are between 800 and 1,000 Facebook moderators worldwide. They:

  • make up 1/3 of Facebook's full-time workforce
  • collectively speak 24 languages
  • are often outsourced to countries like India and the Philippines, where they may reportedly make less than $1 an hour
  • in the US, are often recent college graduates who can't find work in their fields or are looking for advancement
  • stay in the job an average of only 3-6 months, driven out by low wages and burnout from dealing with disturbing content
  • may, according to one psychologist, suffer from a form of Post-Traumatic Stress Disorder (PTSD)

The Darkness Faced by Moderators

So which content gets moderated?

Inappropriate sexuality

  • Pedophilia
  • Nudity
  • Necrophilia

Graphic content

  • Human bodies (Beheadings, Suicides, Murders)
  • Animal abuse

Illegal activity

  • Drugs
  • Harassment and threats
  • Domestic violence

Some Facebook users feel that content moderation is inconsistent

  • Okay – male nipples
  • Not okay – breastfeeding
  • Okay – most bodily fluids
  • Not okay – bodily fluids with a human shown

How Is Facebook Moderated?

Facebook partners with organizations to improve policies and report handling:

  • Safety Advisory Board (Advises on user safety)
  • National Cyber Security Alliance (Educates users on account and data security)
  • More than 20 suicide prevention agencies (Lifeline, Samaritans, AASRA)

Users can report content and categorize it as one of the following (modeled in the sketch after this list):

  • “I don’t like this post”
  • Harassment
  • Hate speech
  • Spam
  • Sexually explicit
  • Violence or harmful behavior (Threats of vandalism or violence, Graphic violence, Illegal drug use, Self-harm or suicidal content)
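
To make the flow concrete, here is a minimal Python sketch of how such a user report might be modeled. Everything in it (the class names, the reason values) is an illustrative assumption, not Facebook's actual code or API.

    from dataclasses import dataclass
    from enum import Enum, auto

    class ReportReason(Enum):
        # Mirrors the user-facing report categories listed above;
        # identifiers are hypothetical, not Facebook's real ones.
        DONT_LIKE = auto()
        HARASSMENT = auto()
        HATE_SPEECH = auto()
        SPAM = auto()
        SEXUALLY_EXPLICIT = auto()
        VIOLENCE_OR_HARMFUL = auto()

    @dataclass
    class UserReport:
        post_id: str       # the flagged post
        reporter_id: str   # who filed the report
        reason: ReportReason

    # Example: a user flags a post as spam.
    report = UserReport(post_id="p123", reporter_id="u456", reason=ReportReason.SPAM)
    print(report.reason.name)  # SPAM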

An automatic algorithm sorts through reports before a human sees them. The reports are categorized and sent to a moderation team (a routing sketch follows the list). Categories include:

  • Safety (Graphic violence)
  • Hate and harassment (Hate speech)
  • Access (Hacker and imposter issues)
  • Abusive content (Sexually explicit)
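
As a rough illustration of that pre-screening step, here is a small Python sketch that buckets incoming reports into the four queues above. The mapping and all names are assumptions made for the example; the real routing algorithm is not public.

    # Hypothetical mapping from report reason to internal moderation queue.
    QUEUE_BY_REASON = {
        "graphic_violence": "safety",
        "hate_speech": "hate_and_harassment",
        "hacked_or_imposter": "access",
        "sexually_explicit": "abusive_content",
    }

    def triage(report: dict) -> str:
        """Route a raw user report to a moderation queue (default: general review)."""
        return QUEUE_BY_REASON.get(report["reason"], "general_review")

    # Example: a hate-speech report lands in the hate-and-harassment queue.
    print(triage({"post_id": "p123", "reason": "hate_speech"}))  # hate_and_harassment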

The moderation teams decide whether content violates community standards. Moderators have three options for responding (sketched after this list):

  • Ignore
  • Delete (Warn the poster, Disable the account)
  • Escalate – Escalated content is forwarded to a California-based team (they may alert authorities to deal with threats and dangerous situations)
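
The decision step might look something like the following sketch. The function and the repeat-offender rule are illustrative assumptions, not Facebook's documented process.

    from enum import Enum

    class Decision(Enum):
        IGNORE = "ignore"
        DELETE = "delete"
        ESCALATE = "escalate"

    def apply_decision(post_id: str, decision: Decision, repeat_offender: bool = False) -> str:
        """Apply a moderator's decision to a flagged post (names are hypothetical)."""
        if decision is Decision.IGNORE:
            return "no action taken"
        if decision is Decision.DELETE:
            # Deletion comes with a warning, or a disabled account for repeat offenders.
            return "account disabled" if repeat_offender else "post removed, poster warned"
        # Escalated content is forwarded to the senior, California-based team,
        # which may alert authorities to credible threats.
        return f"post {post_id} escalated for review and possible referral to authorities"

    # Example: deleting a first-time offender's post.
    print(apply_decision("p123", Decision.DELETE))  # post removed, poster warned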

Most reports are reviewed within 72 hours. The internet is full of offensive content, and social media is no different. Facebook is one of many social networks trying to clean up our newsfeeds so we can enjoy baby photos and funny videos without having to wade through the scariest things the web has to offer.

Take a look at this infographic to learn how Facebook is moderated and who moderates its content.
