Source: https://arstechnica.com/tech-policy/2021/07/facebook-twitter-pledge-to-fight-abuse-of-women-but-leave-lots-of-room-for-failure/

Enough already —

Facebook, Twitter pledge to fight abuse of women but leave lots of room for failure

Will they just throw more algorithms at the problem?

Tim De Chant - 7/2/2021, 4:22 PM

[Photo: A young woman on a smartphone while lying in bed in darkness at night]

Four major social media companies yesterday pledged to make efforts to improve the safety of women on their platforms. Facebook, Google, TikTok, and Twitter all signed the pledge in response to the recommendations of a working group of 120 experts organized by the Web Foundation.

“For too long, women have been routinely harassed, attacked, and subsequently silenced in online spaces. This is a huge threat to progress on gender equality,” said Web Foundation Senior Policy Manager Azmina Dhrodia in a statement. Abuse against women online has reached epidemic proportions, with 38 percent of women reporting personal experience with online violence, according to an Economist Intelligence Unit report.

The pledges represent a step in the right direction, but it's unclear if they’ll do much to address online abuse. “I’m really grateful to Web Foundation for work that they’ve done, genuinely, to get platforms to make some efforts in this regard because there’s been very little movement over time,” Sarah Sobieraj, a professor at Tufts University and faculty associate at the Berkman Klein Center for Internet and Society, told Ars.

“Having said that, looking at the commitments, it’s pretty difficult to know what they’re going to mean or what they’re going to look like,” she said. “They’re very open-ended. A lot depends on how much the platforms are willing to do.”

Eight pledges

The pledges outline eight tactics split evenly between two broad categories: curation and reporting.

The companies promise to:

Build better ways for women to curate their safety online by:

  1. Offering more granular settings (e.g., who can see, share, comment, or reply to posts; a hypothetical sketch follows this list)
  2. Using simpler and more accessible language throughout the user experience
  3. Providing easy navigation and access to safety tools
  4. Reducing the burden on women by proactively reducing the amount of abuse they see

Implement improvements to reporting systems by:

  1. Offering users the ability to track and manage their reports
  2. Enabling greater capacity to address context and/or language
  3. Providing more policy and product guidance when reporting abuse
  4. Establishing additional ways for women to access help and support during the reporting process
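To make the first curation pledge a bit more concrete, here is a minimal, purely illustrative sketch of what per-post audience controls could look like. The audience tiers, field names, and function below are invented for this example; they do not come from any platform's actual settings or API.

    # Hypothetical per-post audience controls illustrating what "more
    # granular settings" could mean in practice. All names are invented.
    from dataclasses import dataclass
    from enum import Enum

    class Audience(Enum):
        EVERYONE = "everyone"
        FOLLOWERS = "followers"
        MUTUALS = "mutuals"    # accounts the author also follows back
        NOBODY = "nobody"

    @dataclass
    class PostControls:
        can_see: Audience = Audience.EVERYONE
        can_share: Audience = Audience.FOLLOWERS
        can_comment: Audience = Audience.MUTUALS
        can_reply: Audience = Audience.MUTUALS

    def is_allowed(setting: Audience, is_follower: bool, is_mutual: bool) -> bool:
        """Decide whether a viewer may perform an action under a setting."""
        if setting is Audience.EVERYONE:
            return True
        if setting is Audience.FOLLOWERS:
            return is_follower
        if setting is Audience.MUTUALS:
            return is_mutual
        return False  # Audience.NOBODY

    # A post anyone can see, but only mutuals can reply to:
    controls = PostControls(can_reply=Audience.MUTUALS)
    print(is_allowed(controls.can_reply, is_follower=True, is_mutual=False))  # False

Of course, nothing in the pledge language commits the platforms to anything this specific.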

Indeed, those commitments don't sound very specific, and users may question how thoroughly the pledges will be implemented: social media companies have made a habit of asking forgiveness after their half-measures fall short. The way the pledges are written gives the platforms plenty of leeway.


Much of the emphasis is on giving users greater control over what they see and improving the reporting process. Many social media platforms have opaque reporting processes that frequently result in no action being taken against harassers. A moderator reviewing a report may misunderstand the context of a post, for example, or fail to grasp the meaning of coded language. Those shortcomings mean that enforcement of existing policies is often uneven.

Even if platforms were to enforce their policies more consistently, the pledges above still mostly place the burden of dealing with abuse on the user, not the platform. “These mechanisms for flagging or reporting are vital—they’re absolutely essential,” Sobieraj said. “But you can’t unsee or unread the content that comes through to you. It’s a lot of work for the women who are heavily targeted. It’s time consuming, but also it’s very upsetting.”

“Not only is it a burden on recipients,” she added, “but also, the senders still know that the recipients have to read [the posts] in order to [report them]. [The abusers] still get the satisfaction. It really needs to be disincentivized.”

To cope with abuse, some women have taken to outsourcing the management of their social media accounts. It's a drastic measure, though, and a costly one, putting it out of reach for the vast majority of women on social media.

“We see that the abuse is especially burdensome for women of color, women from religious minority groups, queer women, and so on,” Sobieraj said. “If people start to recede from participating in public spaces, it’s not fair to them, but it also isn’t fair to the rest of us. The people who leave, there’s a pattern. It’s not that it’s random. We’re going to lose certain groups of voices. Maybe it doesn’t feel like that’s a big deal if you’re talking about pop culture, but if you’re talking about politics, it’s pretty important. And I’d argue that it’s pretty important about pop culture, too.”


Proactive solutions

Rather than keeping the burden of combating abuse largely on users, Sobieraj said social media platforms should take more proactive steps when people become targets of online violence. For one, repeat offenders who aren't yet banned should have their posts held for moderation, effectively forcing the platform to sign off on the content before it's visible to anyone, she said. She also suggested that platforms monitor content trends to spot people who are likely to become targets of abuse and moderate mentions of those people until the attention wanes.
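As a rough sketch of that idea, and not anything the companies have actually pledged, a pre-publication gate might route posts from known repeat offenders, or posts mentioning someone currently under a wave of abuse, into a review queue. Every input here (the repeat-offender set, the temporary "protected targets" list) is hypothetical, invented for illustration:

    # Hypothetical pre-publication gate for the hold-for-review idea
    # Sobieraj describes. The repeat-offender set and "protected targets"
    # list are invented inputs, not real platform data.
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author_id: str
        text: str
        mentions: list[str] = field(default_factory=list)

    def should_hold_for_review(post: Post,
                               repeat_offenders: set[str],
                               protected_targets: set[str]) -> bool:
        """True if the post should wait for human sign-off before anyone,
        including the person mentioned, can see it."""
        # Repeat offenders who aren't yet banned get every post reviewed.
        if post.author_id in repeat_offenders:
            return True
        # While someone is under a wave of abuse, hold all posts mentioning
        # them; remove them from the set once the attention wanes.
        if any(m in protected_targets for m in post.mentions):
            return True
        return False

    # Example: a post aimed at a currently protected user is held for review.
    post = Post(author_id="u42", text="@reporter ...", mentions=["reporter"])
    print(should_hold_for_review(post, set(), {"reporter"}))  # True

Under a scheme like this, the overwhelming majority of posts would publish immediately; only the small slice matching either condition would wait for human review, which is what lifts the burden, and denies the abuser a guaranteed audience, as Sobieraj describes.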

Preemptive moderation in these cases would take the burden off targeted users so they and others wouldn’t have to see the content. It would also remove a key incentive that motivates many abusers, without disrupting everyone else, Sobieraj said. With those safeguards in place, the vast majority of content could still flow freely on social platforms.

Of course, these solutions would require platforms to take a more active role in content moderation. The companies might have this sort of enforcement in mind—the fourth pledge under “curation” could be interpreted that way. But history has shown that social media companies are loath to preemptively moderate content.

The other pledges are helpful but not revolutionary. To follow through on them, platforms likely wouldn’t have to change much. The companies could simply tweak existing systems that are already in dire need of improvement. The result could end up being more algorithmic Band-Aids on problems that algorithms still haven’t solved.


About Joyk


Aggregate valuable and interesting links.
Joyk means Joy of geeK