
‘Among Us’ Has a Moderation Problem

source link: https://onezero.medium.com/among-us-has-a-moderation-problem-266a1d1d8ed1


The game’s sudden popularity caught the small development team by surprise — and now there’s a lot of work to be done

Photo: IGDB

At any given moment, there are tens of thousands of people playing Among Us on Steam, the largest platform for PC gaming. Many more join them on their mobile phones and Nintendo Switches. All told, half a billion people reportedly play the game every month. It’s a sprawling player base whose interactions happen largely through text chat: you complete tasks as a team of colorful astronauts, convening every now and again to root out impostors attempting to sabotage your mission.

It’s also almost entirely unmoderated. Innersloth — the four-person team behind the game — is working with limited resources to build crucial moderation tools and keep bad actors out. Despite promises that such tools would be implemented, Among Us currently contains no basic reporting mechanism or blocking tools to keep gamers — especially teenagers and children — safe.

Online multiplayer games from publishers with larger staffs and more years of experience tend to use robust moderation tools to prevent abuse on their platforms (with varying levels of success). Roblox uses an automated system to filter harsh language and intervene when it appears that a player is trying to lure someone off-platform, for example. But Among Us shows us what popular social games look like without such tools. The result is reminiscent of early internet chat rooms, where adults and children play together in semi-private, unmoderated chats with minimal accountability.

“If you open a hole on the internet, it gets filled with shit.”

OneZero has witnessed players sharing personal information within the game, like their social media handles or real-life location, as well as players self-identifying as adults having sexualized conversations with players identifying as teenagers.

“A senior exec at a digital media company told me very early on… If you open a hole on the internet, it gets filled with shit,” Sarah T. Roberts, an associate professor of information studies at the University of California, Los Angeles, and a content moderation expert, told OneZero. In other words, without proper moderation tools, there’s nothing to prevent the chat rooms in Among Us from becoming a haven for abusive players.

Players do not create game-specific profiles when they start playing Among Us. They can connect specific store accounts to the game if they so choose — to sync cosmetic purchases across devices using their Steam ID, for example — but this information is never displayed to other players.

Instead, players are identified by usernames that they can change every game if they choose. While usernames go through a profanity filter to prevent some crass language, there’s no way to find a player after they’ve left a game, identify the same user playing on a different platform, or report their account from within the game itself. If something happens and you want to report a player, you have to find a support email address, describe the situation, and hope for the best. The email address is [email protected], which is referenced in some of the company’s dev log updates, but it can’t be found without leaving the game.

When reached by OneZero in December, Victoria Tran, community director at Innersloth, referenced this email address and promised that accounts would soon be added to Among Us.

“We have a support email that players can use to contact us, but more importantly, we’re implementing accounts into the game for higher reporting abilities,” Tran said.

OneZero first reached out to Innersloth on December 9, and on December 10, the developer said the feature would arrive “in the coming weeks.” OneZero followed up on January 5 and was told “We’re hoping to push out a dev log by the end of this week,” but that update instead landed on January 19. It touted a new map, the game’s arrival on the Nintendo Switch and the Epic Games Store, and its inclusion in Xbox Game Pass for PC. It also addressed the difficulty of building out new features with a four-person team, saying that even hiring new people to help “doesn’t automatically make development go faster — it slows it down,” if not done properly. On the subject of moderation tools, the developer said they’re “hoping to get this out ASAP.”

“We aren’t just going to depend on bots/automation to handle these things since they’re so fallible for reasons I’m sure you’re aware of. So there will be actual people looking at the reports,” Tran told OneZero when reached again in January. It’s not merely an issue of adding a “Report player” button. Someone has to be on the other end to receive those reports, and that means building out more infrastructure.

Until moderation features arrive, players have very few options to report or moderate abuse in the game. Game hosts can kick or ban other players from an individual room, but without player profiles, there’s no way to follow up and report players for abuse or suspicious activity outside of the support email.

In one game viewed by OneZero, a player volunteered the city and street where they live. Other players sometimes share Snapchat or Instagram usernames so they can find each other later, and there are multiple missed-connections Facebook groups where players try to reconnect with people they met in a game. Preventing young people from sharing personal information online is challenging even for the tech giants, but it becomes even harder when there is no way to report a player for soliciting that information.

Among Us game rooms can be listed publicly for others to join. A maximum of 10 players can join each game, which means very few people can even witness suspicious or abusive activity in any given room. And until accounts are implemented, players can only be identified by the color of their avatar and the username they chose for that particular game.

For example, in one game observed by OneZero, a lime-colored game host with the username “Im a boy i” identified themselves as 18 years old and asked the rest of the group how old they were. The pink player, with the username “plueberry,” identified themselves as 13 years old. At the end of the game, Lime banned nearly everyone in the chat one by one, leaving only Pink and one other player in the game.

Since bans are applied one at a time, it’s impossible to know whether Lime banned anyone else after OneZero was booted, but this control would make it possible for Lime to turn the game into a direct chat with a person who had previously identified themselves as underage. Predators sometimes use the chat function of games like Fortnite or Roblox to convince younger players to connect in private chats outside the game. But in Among Us, the person who controls the room can change a public room to a private one, preventing any new players from joining and effectively creating a private direct message channel in the game itself. While the room will eventually expire if no game is started (and a game can’t be started with just two players), this allows for up to 10 minutes of private, unmoderated chat between strangers, guarded only by an optional filter that censors a few bad words.


Reporting this incident as suspicious would involve emailing a generic support line to report a lime-colored player named “Im a boy i.” That isn’t the kind of thing most players are likely to do (or even know how to do), and there’s no way to know how many players across all games might fit that description.

In another game, players with names including “HornyBoi,” “sxy gurl,” “hrnygirl,” and “bby gurl” identified themselves as being between 14 and 18 years old. At one point, “bby gurl” (Pink) told the room to play Truth or Dare and asked “HornyBoi” (Blue) to “tell us your darkest secret.” Blue responded, “I’ve sexted on here before.”


Since there is no identity or even account verification, it’s impossible to know if any of the information players share is true. It’s possible that the players identifying as teenage girls are, in reality, fully grown men. Moreover, players choose names before joining a room, so the similar-sounding names could imply that these players all knew each other before joining. The Truth or Dare game could be an elaborate roleplay, a practical joke, or a predatory scheme. Players can’t tell which, and the only way to inform the developers so they can investigate is through the support email address. Meanwhile, the only in-game moderation feature, a language filter, is easy to bypass by, for example, using a username like “HornyBoi.” instead of “Horny Boi,” which does get blocked.
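To illustrate why that kind of bypass works, here is a minimal sketch of a naive word-list filter, written in Python purely for illustration. It assumes, hypothetically, that usernames are split on spaces and each token is checked against a blocklist; this is not Innersloth’s actual implementation, and the blocklist entries are made up.

# Minimal sketch of a naive word-list username filter (hypothetical;
# not Innersloth's actual code). It only blocks exact, space-separated
# word matches, so small spelling changes slip through.

BLOCKLIST = {"horny", "sexy"}  # assumed entries, for illustration only

def is_blocked(username: str) -> bool:
    """Return True if any space-separated token is on the blocklist."""
    tokens = username.lower().split()
    return any(token in BLOCKLIST for token in tokens)

print(is_blocked("Horny Boi"))  # True: "horny" is an exact token match
print(is_blocked("HornyBoi."))  # False: concatenation and punctuation evade the check

More robust filters normalize punctuation and match substrings, but players routinely defeat even those with creative spellings, which is one reason Tran says actual people, not just automation, will handle reports.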

“Unfortunately, for the small group of developers who have, unwittingly perhaps, had this megahit on their hands, they are now in the position… to be in a certain way, a victim of their own success,” UCLA’s Roberts told OneZero. The developers are now faced with the challenge of building out moderation features after the game has already exploded in popularity. “Which is the worst position to be in because, in essence, it’s already on fire.”

Moderating a community of this size and complexity would be no easy feat even for gigantic developers. In larger games like Fortnite (owned by Epic Games) and Minecraft (owned by Microsoft), developers use a combination of human and automated moderation, identity verification, and community education. Expecting a team of four people, working on a 2018 game that suddenly blew up in popularity two years after its release, to get it right during a pandemic is unrealistic.

However, the situation highlights a glaring issue for indie developers. Small teams can reach enormous audiences, often simply by the happenstance of being featured by the right Twitch or YouTube streamer. Tools that are adequate for a small indie audience can become woefully inadequate for a global one. An email-based reporting system might be fine for a modest player base, but an audience of 500 million requires far more robust automated tools, in-game reporting options, and raw manpower.

“My advice to anyone in this arena is to think about, what do bad guys do? And how do they do it? And how can you solve for that?” Roberts said. “Because if you open a hole on the internet, we know what it gets filled with.”

Update: A previous version of this article misstated Sarah T. Roberts’ title. She is an associate professor. Additionally, a quote about holes on the internet has been updated to include a preceding clause from the conversation that clarifies the quote’s origin.

