Ultimately, the effectiveness of any moderation tool comes down to effort:
- The effort required to enforce community standards.
- The effort required to disrupt the community.
- The effort required to evade bans.
Keep the effort low for moderators and high for disruptors, and you will have the capacity for effective moderation. Fail to do so, and you are playing a losing game of whack-a-mole - it will be impossible to scale a site's moderation fast enough to keep up with growth and persistent trolls.
Another problem of scale is that, as moderators become embittered by constant exposure to the worst forms of human communication imaginable, they change too, and their ability to be fair and impartial is compromised. Some become toxic, lashing out at users; some become desensitized, glossing over clear violations; and some start to play favorites, perhaps allowing bullying by an established commenter while showing no leniency whatsoever to newcomers.
Where this becomes challenging is that the last two points - the effort required to disrupt, and the effort required to evade - are also at odds with producing a lively community. If you make it too hard to post, people will become frustrated and give up; if you make it too hard to register, people just won't. Furthermore, the most toxic users always seem more determined to jump through whatever hoops you've set out than the audience you actually want to attract.
Given these properties, many communities find it necessary to add a reputation system. A good reputation system recognizes investment in, and engagement with, the community.
Reputation cannot replace moderation, but it can supplement it in ways that allow moderation tools and staff to scale better.
- Users with low, or even poor, reputation can be moderated by the community instead of, or ahead of, official moderators.
- Users with high reputation can be made resistant to community moderation backlash, but not to actions by actual appointed moderators. They've proven themselves, so holding a controversial opinion alone does not mean they are being disruptive.
- Reputation can be hard to get even when the goal is to make registration as easy as possible, meaning that when a troll is banned, they need to invest time rebuilding it on a new account.
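The three points above can be sketched as a small reputation-gated flagging routine. Everything here - the thresholds, the class names, the return messages - is a hypothetical illustration of the idea, not any real site's implementation:

```python
# A minimal sketch of reputation-gated community moderation.
# All names and threshold values are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    reputation: int = 0  # earned slowly through accepted contributions

@dataclass
class Post:
    author: User
    flags: int = 0  # community flags accumulated against this post

TRUSTED = 100        # established users: only staff can act on their posts
CAN_FLAG = 20        # you must invest in the community before moderating others
AUTO_HIDE_FLAGS = 3  # low-reputation posts hide after a few flags

def flag(post: Post, flagger: User) -> str:
    """Community moderation that is cheap against throwaway accounts
    but ineffective against proven members (staff handle those)."""
    if flagger.reputation < CAN_FLAG:
        return "ignored: flagger has not invested in the community"
    if post.author.reputation >= TRUSTED:
        return "queued for staff review: author is established"
    post.flags += 1
    if post.flags >= AUTO_HIDE_FLAGS:
        return "hidden pending moderator review"
    return "flag recorded"
```

Note how this also captures the ban-evasion cost: a banned troll's fresh account starts below `CAN_FLAG` and `TRUSTED`, so it can neither moderate others nor resist being flagged until it reinvests the time.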
However, if you aren't very careful, reputation can turn a site into an echo chamber. Reddit is a great example of this: commenters who violate the established mob mentality on a particular subreddit get vigorously down-voted until their comments are hidden. There needs to be room for well-reasoned dissent in many serious forums, particularly those that cover controversial subjects - if your reputation system heavily favors groupthink, that dissent won't survive.
Some sites go further with their reputation systems to curb these negatives. Stack Exchange attaches a cost to downvoting: you sacrifice some of your own reputation every time you click that down arrow. Discourse has a "Flag" button rather than a downvote button. Slashdot requires moderation actions to be accompanied by keywords like "Informative", "Funny", "Troll", or "Flamebait" indicating why a post was moderated the way it was; those actions are then anonymized and "metamoderated" to detect moderator biases and remove "bad" community moderators from the pool.
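The downvote-with-a-cost mechanic is simple to express. This is a sketch of the general idea only; the actual point values Stack Exchange uses are not reproduced here, and both penalty parameters are invented for illustration:

```python
# Sketch of "downvotes cost the voter" - illustrative numbers only,
# not Stack Exchange's real values.

def downvote(voter_rep: int, author_rep: int,
             voter_cost: int = 1, author_penalty: int = 2) -> tuple[int, int]:
    """Return updated (voter_rep, author_rep) after one downvote.

    Charging the voter a small fee turns downvoting into a considered
    act rather than a free expression of disagreement, which dampens
    the pile-on dynamics described above.
    """
    if voter_rep < voter_cost:
        raise ValueError("not enough reputation to downvote")
    return voter_rep - voter_cost, max(0, author_rep - author_penalty)
```

The design trade-off: because disagreement is no longer free, casual drive-by downvotes drop off, but determined users can still spend reputation to bury content they dislike, which is why some systems pair the fee with metamoderation.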