Twitch will now take action against threats, violence and sexual assault on other platforms

Image: Thomas Trutschel (Getty Images)

Many Twitch streamers have YouTube channels, and vice versa. They also have Twitter pages, Instagrams, TikToks, Snapchats, and basically every other type of account under the sun. Twitch has considered activity on other platforms for years when deciding whether to suspend or ban streamers, though inconsistently, and usually only when a violation also occurred on Twitch or involved another Twitch streamer. Now it is trying to make off-platform activity a cornerstone of its moderation approach, working with a third-party law firm to investigate off-platform threats, sexual assault, and other forms of harassment and violence.

Twitch announced its new approach to off-platform misconduct in a blog post today, saying that it has "historically acted against serious, obvious misconduct that has taken place outside of the service, but until now we have not had an approach that scaled." That is where the law firm, which will assist Twitch's internal team, comes in.

"This partner is an experienced investigative law firm dedicated to conducting independent workplace and campus investigations, including investigations related to sexual discrimination or assault," the company wrote, noting that while it has tools to verify what did and did not occur on Twitch, that is much harder to do on other platforms. "This partnership will enable us to investigate and respond to reports of misconduct. We have also expanded the scope of our internal law enforcement response team, which is extensively trained to manage sensitive, confidential investigations and to cooperate with law enforcement."

Twitch also gave examples of the kinds of behavior that can get streamers booted without ceremony, even if they do not occur on Twitch. These include lethal violence and violent extremism, threats of mass violence, terrorist activities or recruitment, leadership of or membership in a known hate group, carrying out or acting as an accomplice to non-consensual sexual activities and/or sexual assault, sexual exploitation of children (including grooming), threats of violence at events, and threats against Twitch staff.

Twitch conceded that the list does not cover all forms of harassment and abuse. In an email to Kotaku, a Twitch spokesperson said: "We will only act in cases where we have demonstrable evidence, and at this time we are unable to investigate cases that fall outside the conduct listed in the policy." Evidence, in this case, could mean "links, screenshots, video of off-Twitch behavior, interviews, police filings, or interactions confirmed by our law enforcement response team or our third-party investigators."

As with other moderation decisions made by Twitch's internal teams, only streamers and others directly involved in investigations will know their outcomes. "We understand that this may be frustrating, but we have sought the support of these third-party investigators to protect the rights, confidentiality, and privacy of all involved, and to preserve the integrity of these investigations," the company wrote, also noting that it has created an email address for off-service misconduct reports, [email protected], where all information sent by users will be treated as confidential.

It remains to be seen how all of this will play out, given that it adds a degree of opacity to a moderation process in which Twitch streamers and viewers have historically demanded more transparency, not less. That said, it appears to close a gap in Twitch's rules. Last September, for example, Twitch allowed LeafyIsHere, a creator banned on YouTube after harassing streamers like Pokimane, to migrate to its platform. It then allowed him to stream for a few weeks, despite the obvious risks, until he also committed an offense on Twitch. Only then did it ban him. The problem with that method is that it gives people who have done harm elsewhere a chance to do more. Theoretically, Twitch now has the tools to prevent similar situations from unfolding.

There are reasons to be skeptical. First, Twitch did not provide much information about the law firm it is working with, and the firm's specific areas of expertise, workplace and campus investigations, involve systems in which it is notoriously difficult for victims to get justice. There are also red flags in how Twitch has recently chosen to apply its existing rules off the platform. Last month, for example, the platform finally banned longtime streamer Ali "Gross Gore" Larsen for inappropriate off-platform behavior stemming from a "revenge porn" incident, but only after years of second chances following other documented instances of off-platform sexual harassment and assault. It also seems unlikely that these new policies will protect streamers from mass off-platform harassment campaigns like the one Critical Bard endured earlier this year after committing the crime of calmly explaining the history that made Black Lives Matter a necessary movement.

If nothing else, Twitch knows it still has work to do.

"Taking action against misconduct that occurs entirely off our service is a new approach for both Twitch and the wider industry, but we believe, and hear from you, that it is crucial to get right," the company wrote. "Part of that means being clear with you about the limitations of our policies. At this time, we are unable to investigate behaviors that occur entirely off Twitch and fall outside these categories. It's an iterative, ongoing process, and as always, our ultimate goal is to build a safer Twitch for everyone."
