Twitch policy update clarifies exactly what it means by 'sexual harassment'

It's accompanied by an AutoMod update to better weed out creepy comments.

Twitch has issued new "clarifications" to its sexual harassment policy in a bid to make it easier to understand, released alongside an update to its AutoMod moderation tools designed to let streamers curb inappropriate messages in chat.

In a blog post released yesterday, Twitch wrote that its sexual harassment policy "remains largely unchanged," but that it was adding a chunk of clarifying language to—it hopes—draw clearer boundaries around the kinds of behaviour that aren't accepted on its platform.

The new language gets a lot more particular about exactly what Twitch doesn't want to see in its chats. "We prohibit unwanted comments–including comments made using emojis/emotes–regarding someone's appearance or body, sexual requests or advances, sexual objectification, and negative statements or attacks related to a person's perceived sexual behaviours or activities, regardless of their gender," read the updated guidelines. "We also do not tolerate the recording or sharing of non-consensual intimate images or videos under any circumstances, and may report such content to law enforcement."

In other words, Twitch is now trying to give its audience a robust list of specific behaviours it can use as a reference when deciding what is and isn't kosher to post in chat, rather than relying on a broader, vaguer ban on creepy behaviour.

The platform goes on to emphasise that it's (as you might expect) the non-consensual element of this behaviour it has a problem with, and encourages anyone who thinks they've been wrongly penalised over "mutual, consensual comments" to file an appeal. It notes, though, that some comments—like "expressing a desire to commit sexual violence"—are never allowed on Twitch regardless of any other factors.

Twitch is also updating AutoMod, the machine learning gizmo streamers can use to nip inappropriate comments in the bud automatically, to "better combat sexual harassment."

"We’ve developed a new AutoMod category that will allow you to filter out chat messages that could be considered sexual harassment," says Twitch, "This new category will provide an additional layer of protection, and can help to block those messages in the moment, before they show up in chat." 

Streamers will be able to decide how strict AutoMod is about holding back potentially problematic comments, and alongside their mods they'll be able to review held messages to decide what to do with them. The new tools rolled out yesterday, although they're currently English-only.

Sexual harassment is, of course, absolutely endemic in online spaces, and Twitch is no exception. Numerous streamers—the vast majority of them women—have plenty of horror stories of viewers crossing the line in one way or another, whether it's creepy comments or outright stalking. Tools like these are an unfortunate necessity, though by no means adequate to stomp the problem out entirely. 
