Reddit Wants to Exile Trolls. But Growing Up Is Hard

Reddit is finally growing up. Or at least it's trying.

For a decade, the site has been known for its anonymous, user-driven community, its volunteer-moderated forums, and its staunch defense, on free speech grounds, of even its most notoriously offensive subreddits and trolls. But that hands-off attitude is slowly changing.

Reddit is the latest social media company to update its policies to explicitly ban the harassment of individual users. According to the new rules, the site now prohibits systematic attacks that make an individual feel Reddit is an unsafe place to express ideas, or that make someone fear for their safety (or the safety of those around them).

“We’ve seen many conversations devolve into attacks against individuals,” the Reddit team explained in a blog post announcing the policy yesterday. “Some users are harassing people across platforms and posting links on Reddit to private information on other sites.” The site seems to hope that the ban will help curtail future targeted abuse.

Whether or not the new policy curbs Reddit's culture of trolling, it's still significant that Reddit is admitting it has a problem---and one that it has not addressed well enough in the past. In a recent survey of 15,000 redditors, the company found that users feel uncomfortable contributing to the site because of negative responses to comments. In fact, the top reason users don’t use or recommend the site is to avoid “hate and offensive content.” (One redditor told me users on her subreddit are frequently called “Nazis, fascists, and fatties” for trying to keep hateful people out.)

But, as on Twitter, Reddit’s effort to halt harassment won’t be easy. While an anti-harassment policy seems necessary for a site that played a central role in both Gamergate and the celebrity selfie hack, ferreting out online abuse is complicated---and stopping it even more so.

Under the new policy, Reddit now encourages users who feel threatened to email the company and report the abuse. With more than 9,000 active communities and millions of visitors, the small Reddit team (total employees: 76) will need to read, assess, and address each report. The policy does not explain how, exactly, harassment will be addressed, nor does it go into detail about whether other pervasive forms of abuse will be dealt with, such as when people who aren't redditors are targeted.

The problem for Reddit (in which Advance Publications, WIRED's parent company, owns a stake) is a classic one: how much freedom is too much? The site was founded as an open forum, like the Internet itself, where conversation is driven by the community, freedom of expression is prized above all, and interference by authorities is kept to a minimum. Redditors can start their own subreddits, which are moderated by volunteers who establish self-contained (and self-enforced) rules. At least that's the ideal vision.

In practice, trolls can all too easily exploit Reddit's seemingly naive faith in self-regulation. Subreddits can be overrun with abusers to the point that they become unmanageable, their original intent ruined by bad actors seeking to drown out the voices with which they disagree. In effect, a site supposedly devoted to freedom of speech becomes a venue where the only people who get to be heard are the ones who want to shut everyone else up.

A Step Toward Change

The new anti-harassment policy appears to be an effort to push back against that paradox. "Instead of promoting free expression of ideas, we are seeing our open policies stifling free expression; people avoid participating for fear of their personal and family safety," the company wrote.

But some redditors are concerned that the new rules don’t go far enough. It seems unlikely, for instance, that Reddit will use the policy to take down subreddits whose offensiveness some redditors believe veers into outright harassment, such as those devoted to white supremacy, fat-shaming, or photographing women without their consent. “There are a lot of ways to view the content of an entire subreddit and harassment,” a spokesperson for Reddit wrote in response to questions from WIRED. “Views we disagree with or find offensive will not be affected. Posts that meet the criteria of harassment stated [in the blog post] will be addressed.”

Other redditors don’t think the changes will have a real impact. “Let’s say I feel that Reddit, as a whole, is 'unsafe' for my ideas. And I get one or two or even six users banned due to harassment. How does that change Reddit as a whole?” asked one user, kvachon. “It doesn’t, it just (temporarily) removes specific users from the site. The subreddit and culture will be the same. Why not just add an ignore button?”

For now, Reddit is hoping that this move can curtail some of the most egregious abuses of individuals. And that could be a start toward real change. “This won’t solve all the issues of harassment on Reddit or the Internet,” says Washington University law professor Neil Richards. “But it’s a good first step. It’s important to change the social norms and values of the site, to create an emerging culture of free expression online that is sensitive to harassment.”

Because Reddit wants to grow up. That much is clear. The company is launching its own videos and building out its ad team, and earlier this year it released its first transparency report. After all, Reddit has a business to run. But every growth spurt brings growing pains, and it’s unclear whether this one will hurt redditors, subreddits, or trolls, or end up helping no one at all.

Or maybe Reddit really can change. “Reddit has this reputation as the Wild West, but the Wild West turned into Kansas,” Richards says.