source link: https://www.socialmediatoday.com/news/Oversight-Board-Criticizes-Metas-Lenient-Moderation-of-Celebrities/638120/

Meta’s Oversight Board Criticizes the Company’s More Lenient Moderation Approach for Celebrities

Published Dec. 6, 2022

By Andrew Hutchinson, Content and Social Media Manager

Meta’s Oversight Board has criticized the company’s differentiated moderation system for high-profile users, which can see rule-violating content from celebrities and politicians left up on the platform for months, while the same content from regular users would be removed within days.

The comments are part of the Oversight Board’s review of Meta’s ‘Cross Check’ system, which adds an additional layer of moderation for high-profile users.

Here’s how it works – with Meta overseeing more than 100 million enforcement actions every day, it’s inevitable that some things will slip through the cracks, and that some content will be removed, or left up, when it shouldn’t have been. Because high-profile users generally have a much larger audience in the app, and what they say can therefore carry more weight, Meta has an additional, specialized moderation system in place which double-checks enforcement decisions for these users.
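To make the general idea concrete, here’s a minimal, hypothetical sketch of a two-tier enforcement queue of this kind. Every name, list, and decision below is invented for illustration only – this is not Meta’s actual implementation, just a simplified picture of how routing certain accounts to an extra review step delays enforcement for them.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the account list, field names, and decisions
# are invented to show the general idea of a secondary review queue.

@dataclass
class Post:
    author_id: str
    content: str
    flagged_as_violating: bool  # outcome of the standard (often automated) check

CROSS_CHECK_LIST = {"high_profile_account_123"}  # hypothetical list of entitled accounts

def enforce(post: Post) -> str:
    """Route an enforcement decision through an extra review tier for listed accounts."""
    if not post.flagged_as_violating:
        return "leave up"
    if post.author_id in CROSS_CHECK_LIST:
        # Listed accounts are not actioned immediately; the decision is queued
        # for additional human review, so violating content can stay up longer.
        return "queue for secondary review"
    # Everyone else gets the standard enforcement path right away.
    return "remove"

print(enforce(Post("regular_user", "...", True)))              # remove
print(enforce(Post("high_profile_account_123", "...", True)))  # queue for secondary review
```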

In other words, celebrities are held to a different standard than regular users with regard to how their content is moderated in the app. That’s not entirely fair, but given their broader audience reach, there is some logic to Meta’s approach.

So long as it works as intended.

Last year, The Wall Street Journal uncovered this alternative process for celebrities, and highlighted flaws in the system which can effectively see high-profile users held to a different standard, and left essentially unmoderated, while others see similar comments removed. That prompted Meta to refer its Cross Check system to the Oversight Board, to rule on whether it’s a fair and reasonable approach, or whether more could or should be done to improve it.

And today, the Oversight Board has shared its key recommendations for updating Cross Check:

(Image: Oversight Board recommendations on updating Meta’s Cross Check program)

Its additional comments were fairly critical - as per the Oversight Board:

“While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns. By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”

In its analysis, the independent Oversight Board found the Cross Check system to be flawed in several areas, including:

  • Delayed removal of violating content
  • Unequal access to discretionary policies and enforcement
  • Failure to track core metrics
  • Lack of transparency around how Cross Check works

Because of the differentiated enforcement approach, the Oversight Board has recommended that Meta revamp the Cross Check system, and provide more insight into how it works, to ensure that celebrities are not being held to a different standard than regular users.

Which is in line with most of the Oversight Board’s recommendations. A key, recurring theme of all of its reviews is that Meta needs to be more open in how it operates, and how it manages the systems that people interact with every day.

Really, that’s the key to a lot of the issues at hand – if social platforms were more open about how their algorithms influence what you see, how their recommendations guide your behavior in-app, and how they decide what is and is not acceptable, their enforcement actions would be easier to understand, and easier to defend.

But at the same time, being totally open could also prompt even more borderline behavior. Meta CEO Mark Zuckerberg has previously noted that:

“…when left unchecked, people will engage disproportionately with more sensationalist and provocative content. Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average — even when they tell us afterwards they don’t like the content.”

Maybe being more open about the specifics would prompt more users, keen to maximize engagement, to push right up to those boundaries, while that extra detail could also give scammers and spammers more opportunities to slip through the cracks, which is likely harder if Meta doesn’t communicate the specifics.

But from a rules perspective, Meta does need to have more specific policies, and more specific explainers that detail violations. It has improved on this front, but again, the Oversight Board has repeatedly noted that more context is needed, with more transparency in its decisions.

I guess the other consideration here is labor time, and Meta’s capacity to provide such insight at a scale of 2 billion users and millions of violations every day.

There are no easy answers, but again, the bottom line recommendation from the Oversight Board is that Meta needs to provide more insight, where it can, to ensure that all users understand the rules, and that everyone is then treated the same, celebrity or not.

You can read more about the Oversight Board’s recommendations here.

