Facebook's 'oversight' body overturns four takedowns and issues a slew of policy suggestions

Facebook's self-regulatory 'Oversight Board' (FOB) has delivered its first batch of decisions on contested content moderation decisions almost two months after picking its first cases.

A long time in the making, the FOB is part of Facebook's crisis PR push to distance its business from the impact of controversial content moderation decisions -- by creating a review body to handle a tiny fraction of the complaints its content takedowns attract. It started accepting submissions for review in October 2020 -- and has faced criticism for being slow to get off the ground.

Announcing the first decisions today, the FOB reveals it has chosen to uphold just one of Facebook's earlier content moderation decisions, overturning the other four.


Decisions on the cases were made by five-member panels that contained at least one member from the region in question and a mix of genders, per the FOB. A majority of the full board then had to review each panel's findings to approve the decision before it could be issued.

The sole case where the board has upheld Facebook's decision to remove content is 2020-003-FB-UA, in which Facebook removed a post under its Community Standard on Hate Speech that used the Russian word “тазики” (“taziks”) to describe Azerbaijanis, who the user claimed have no history compared to Armenians.

In the four other cases, the board has overturned Facebook takedowns, rejecting the tech giant's earlier assessments under its policies on hate speech, adult nudity, dangerous individuals/organizations, and violence and incitement. (You can read an outline of these cases on its website.)

Each decision relates to a specific piece of content, but the board has also issued nine policy recommendations.

These include suggestions that Facebook [emphasis ours]:

  • Create a new Community Standard on health misinformation, consolidating and clarifying the existing rules in one place. This should define key terms such as “misinformation.”

  • Adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook’s threshold of imminent physical harm.

  • Increase transparency around how it moderates health misinformation, including publishing a transparency report on how the Community Standards have been enforced during the COVID-19 pandemic. This recommendation draws upon the public comments the board received.

  • Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing. (The board made two identical policy recommendations on this front related to the cases it considered, also noting in relation to the second hate speech case that "Facebook’s lack of transparency left its decision open to the mistaken belief that the company removed the content because the user expressed a view it disagreed with.")

  • Explain and provide examples of the application of key terms from the Dangerous Individuals and Organizations policy, including the meanings of “praise,” “support” and “representation.” The Community Standard should also better advise users on how to make their intent clear when discussing dangerous individuals or organizations.

  • Provide a public list of the organizations and individuals designated as "dangerous" under the Dangerous Individuals and Organizations Community Standard or, at the very least, a list of examples.

  • Inform users when automated enforcement is used to moderate their content, ensure that users can appeal automated decisions to a human being in certain cases, and improve automated detection of images with text-overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.

  • Revise Instagram’s Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between Instagram’s Community Guidelines and Facebook’s Community Standards, the latter take precedence.

Where it has overturned Facebook takedowns, the board says it expects Facebook to restore the specific pieces of removed content within seven days.

In addition, the board writes that Facebook will "examine whether identical content with parallel context associated with the board’s decisions should remain on its platform." It also says Facebook has 30 days to respond publicly to its policy recommendations.

So it will certainly be interesting to see how the tech giant responds to the laundry list of proposed policy tweaks -- perhaps especially the recommendations for increased transparency (including the suggestion it inform users when content has been removed solely by its AIs) -- and whether Facebook is happy to align entirely with the policy guidance issued by the self-regulatory vehicle (or not).

Facebook created the board's structure and charter and appointed its members -- but has encouraged the notion that it's 'independent' of Facebook, even though it also funds the FOB (indirectly, via a foundation it set up to administer the body).

And while the Board claims its review decisions are binding on Facebook, there is no such requirement for Facebook to follow its policy recommendations.

It's also notable that the FOB's review efforts are entirely focused on takedowns -- rather than on things Facebook chooses to host on its platform.

Given all that, it's impossible to quantify how much influence Facebook exerts on the Facebook Oversight Board's decisions. And even if Facebook swallows all the aforementioned policy recommendations -- or, more likely, puts out a PR line welcoming the FOB's "thoughtful" contributions to a "complex area" and says it will "take them into account as it moves forward" -- it's doing so from a place where it has retained maximum control of content review by defining, shaping and funding the "oversight" involved.

TL;DR: An actual supreme court this is not.

In the coming weeks, the FOB will likely be most closely watched over a case it accepted recently -- related to Facebook's indefinite suspension of former U.S. president Donald Trump after he incited a violent assault on the U.S. Capitol earlier this month.

The board notes that it will be opening public comment on that case "shortly."

"Recent events in the United States and around the world have highlighted the enormous impact that content decisions taken by internet services have on human rights and free expression," it writes, going on to add that: "The challenges and limitations of the existing approaches to moderating content draw attention to the value of independent oversight of the most consequential decisions by companies such as Facebook."

But of course this "Oversight Board" is unable to be entirely independent of its founder, Facebook.

Update: In a genuinely independent response to the FOB's decisions, the unofficial "Real Facebook Oversight Board" -- whose ranks are made up of people Facebook did not handpick -- produced a withering assessment, saying the rulings are riven with "deep inconsistencies" and set a "troubling precedent for human rights."

"The Oversight Board's rulings confirm Facebook's worst-kept secret -- it has no moderation strategy and no clear or consistent standards," the Real Facebook Oversight Board added.

Update 2: In a public response to the FOB's first decisions, Facebook said: "We will implement these binding decisions in accordance with the bylaws and have already restored the content in three of the cases as mandated by the Oversight Board. We restored the breast cancer awareness post last year, as it did not violate our policies and was removed in error."

It added that it would consider the "numerous policy advisory statements" the FOB has also issued, noting that it has up to 30 days to "fully consider and respond".

"We believe that the board included some important suggestions that we will take to heart. Their recommendations will have a lasting impact on how we structure our policies," it added.

Facebook's initial response to the FOB's first decisions makes no direct mention of the latter's decision to overturn a hate speech takedown and order the reinstatement of a post by a user in Myanmar that suggested there is something wrong with the mindset of Muslims.

The post referenced two widely shared photographs of a Syrian child of Kurdish ethnicity who drowned attempting to reach Europe in September 2015. Per the FOB, the accompanying text questioned the lack of response by Muslims generally to the treatment of Uyghur Muslims in China, compared to the killings in France in response to cartoon depictions of the Prophet Muhammad, before concluding that recent events in France reduced the poster's sympathy for the depicted child and seeming to imply the child might have grown up to become an extremist.

"The Board considered that while the first part of the post, taken on its own, might appear to make an insulting generalization about Muslims (or Muslim men), the post should be read as a whole, considering context," the FOB wrote, explaining how it decided the post was not inciting hatred against Muslims.

However, the Board's decision to overturn the takedown -- in a region where Facebook's platform has for years been implicated in accelerating ethnic violence -- faced incredulity on social media and strong condemnation from the independent Real Facebook Oversight Board.