In context: Reddit's new anti-harassment policies have already spelled the end of certain online communities that were a toxic presence on the platform. The changes have raised some concerns, but the company doesn't have much of a choice, as its business depends on curbing that toxicity.

Reddit has broadened the scope of its policies to make it easier to stamp out abuse on the platform. The changes should make life easier for content moderators, as the previous rules were too narrow and didn't allow them to take action unless a user repeatedly violated them.

Reddit's old rule against "harassment, threatening, or bullying" required tracking a user's behavior until there was evidence of "continued" and/or "systematic" abuse. It also required the victim to "fear for their real-world safety" before a moderator could intervene. As Reddit administrator Landoflobsters notes, this created a lot of confusion and made it hard to take swift action until a situation had already become severe.

The new policy describes abuse as "anything that works to shut someone out of the conversation through intimidation or abuse, online or off." Following someone across the platform's many communities or using private messages to threaten or bully them will now warrant swift action. This also covers situations where a group of people bullies an individual and vice versa, and several subreddits have already been banned as a result. Examples include r/Braincels, a community of "involuntary celibates" that managed to attract the attention of the US military.

It's also worth noting that users who witness rule violations can now report them to help a victim's case. Reddit says it will use these reports to reduce reaction times for moderators, who face the monumental task of policing large communities. The company is also working on "improved machine-learning tools" to quickly sort and prioritize user reports, but the decision-making will still be done by humans.

Reddit admits the new policy's effectiveness ultimately still depends on moderators doing their best to keep bigotry and abuse from festering in the communities they manage. And as some Redditors pointed out in comments on the announcement, it's not yet clear how the changes apply to subreddits that hold strong opinions on specific topics. There's also the possibility that some users will exploit the policy and mass-report people they want to silence.