Let's talk about Orlando

Hi there, I have a vaguely related question that I was hoping the admins could answer.

It's actually about how bans work, and how I think they've been abused. That's the real issue, but let me explain why I think it's related first...

How it's related: I know that a lot of these censorship accusations come from an occasionally troublesome sub, /r/The_Donald. In this case there were clearly some issues, but the /r/news sub had been under the gun for some time before that. I can't help but suspect the moderators on the sub were, by and large, acting in good faith but in a poor environment. As I understand it, they were overwhelmed by bigoted and otherwise low-quality comments and posts. It's not hard to see some of them getting frustrated and just shooting down everything. In my opinion this comes from a toxic aspect of the current reddit community. Let me emphasize: the issue is not /r/The_Donald, but they are exploiting the issue, and I think that has created some toxicity.

So what's the problem? Bans from subs, as they are currently set up, forbid a user from interacting with the sub. The user can see it, but can't interact with it. It turns two-way communication into one-way communication. The alternative I'm suggesting is a complete closure of communication upon banning.

But let's back up first: why was it built like this in the first place? The original setup is actually pretty well thought out. It keeps the incentive to create a new account low, and it allows the banned user to do what the public at large already can.

So what's the problem? It's a Train, Censor, Spam tactic that, to /r/The_Donald's credit, has been executed flawlessly on an ideal topic. What do I mean by each of those?

  • Training: The biggest problem is the use of bans to cultivate an environment of conformity. This isn't training like you'd train a dog; I'm not insulting /r/The_Donald's users. This is how you'd train children, employees, and other intelligent individuals. Bans are treated as trivial but highly annoying punishments, accompanied by unbanning events.
  • Mass Censoring: To compound that issue once, there is no dissent in any form on that sub -- regardless of how you feel about their politics. With total domain over a topic, this is enough to create a very toxic environment. Pick the right topic, and you can grow and mold that community in surprising ways.
  • Spamming: To compound that issue twice, they've made it so you can't leave the community. They've taken /r/all. It's not the front page of the digital newspaper, and maybe an algorithm change would help that, but the users have also taken to informally brigading other popular subs and creating other popular subs (/r/HillaryForPrison), so would an algorithm change be sufficient? I'm not so sure, and I don't think it really hits the core problem.

Okay, so how do we fix it? Clearly bans have to be used responsibly, and unfortunately we can't always entrust the moderator communities to do that, so the admins have to play the role of game devs and design a system in which mods choose to act responsibly. There are a lot of ways to do this, and I don't want to present my preferred solution as the only solution, so real quick, here are some alternatives to get everyone thinking:

  • Paid moderation, jk lol.
  • More powerful Community Managers. Community management is a very hard job. I'm sure reddit has some people whose job is to interact with and manage the community, and I bet that role is seriously underappreciated. Balancing good public relations, personal biases, and a broad awareness of what's going on is beyond difficult. It's easy to call things out in hindsight, but if reddit had a community manager who could have shut /r/The_Donald down when they started abusing the ban system, and maybe helped prop up /r/AskTrumpSupporters in its place, I think a lot of things would be different. Maybe they didn't have the tools to see all this, maybe they were afraid of bad press, or maybe it's just a shit-happens sort of thing. Like I said, hindsight's 20/20.
  • Make bans permanent. It works off the same base concept I'm going to suggest, because mods (not wanting to lose their community) will be more lenient, and users (also not wanting to lose their community) will be more considerate. But I think it's a bit too harsh on the moderators, and it won't really hold users accountable for their actions. I think it would result in people just making a new account every few years and ignoring the rules altogether.

Finally, my preferred solution: a complete closure of communication upon banning. This would prevent the Train, Censor, Spam exploit at its core by preventing moderators from cultivating an environment of conformity. In fact, it does just the opposite: it encourages civilized, reasonable communication, because neither the moderators nor the users want to lose their community. However, it might seem counterintuitive at first, because there are some clear concerns that need to be addressed.

What prevents a user from just logging out to view the community? Nothing, but it's annoying, so that's still an incentive.

Wouldn't simply logging out defeat the purpose of the whole change? No. Most people never change the default settings, so this still works for suppressing the exploit.

Wouldn't this result in more new accounts to circumvent the rules? As far as interacting with the community goes, this is not significantly different from the current model, and since they can just log out to view it, I don't see any serious risk. If you are concerned, you could send a message to accounts logging in from the same IP, reminding them of the rules and the consequences of breaking them.

The elephant in the room: don't you just want the gold users' ability to block subs? I'm currently running on gold for the first time, but before that I used my own userscript, so not personally, no. But this is really two questions. 1) reddit has to make money off this, so how do they sell it to the boss? Let the gold users keep the status quo: they pay to see, but not interact with, the content they got themselves banned from. High-quality content, and higher incentives to buy gold. 2) That said, a highlight feature of gold is filtering /r/all, and it is clearly something people would want to get themselves banned for, so how would you prevent that? There are a billion great ways to do this, and you can really get creative, but here are some of my personal favorites:

  • A dunce cap in their Trophy Case if they're banned from more than X subs.
  • Temporarily silence users who get banned from more than Y subs within a short time T, for a longer time S (say, T = a day and S = a week).
  • A public shadow ban if they're banned from more than Z subs (people only see their comments if they go to their user page, and they get to know they were shadowed, maybe even give them a grey name from then on.)
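The escalating penalties above could be sketched as a simple policy check. To be clear, this is just an illustration of the idea: the thresholds, names, and data shapes are all made up here, not anything reddit actually has.

```python
from datetime import datetime, timedelta

# Hypothetical values for X, Y, Z, T, S from the bullets above.
DUNCE_CAP_THRESHOLD = 5               # X: total sub bans before the trophy appears
SILENCE_THRESHOLD = 3                 # Y: bans within the short window
SILENCE_WINDOW = timedelta(days=1)    # T: the short window
SILENCE_DURATION = timedelta(days=7)  # S: how long the silence lasts
SHADOW_THRESHOLD = 10                 # Z: total sub bans before the public shadow ban

def penalties(ban_times, now):
    """Given the datetimes at which a user was banned from subs,
    return the set of penalties that currently apply."""
    applied = set()
    if len(ban_times) > DUNCE_CAP_THRESHOLD:
        applied.add("dunce_cap")
    # Count only bans inside the short window T.
    recent = [t for t in ban_times if now - t <= SILENCE_WINDOW]
    if len(recent) > SILENCE_THRESHOLD:
        applied.add("silenced_until:" + str(now + SILENCE_DURATION))
    if len(ban_times) > SHADOW_THRESHOLD:
        applied.add("public_shadow_ban")
    return applied
```

The point of keeping it this dumb is that each penalty stays independent, so the admins could tune or drop any one threshold without touching the others.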

And that's it! Pretty simple: if a user is banned, the sub's content no longer appears for them. They can't even see the sub when they go to the page; they'd have to log out. That's my pitch.
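The whole visibility rule fits in a few lines. Again, the function and field names here are invented for illustration, just to pin down the logic:

```python
def can_view(user, sub):
    """Complete closure of communication: a banned, logged-in user sees
    nothing from the sub. Logged-out visitors (user is None) see it the
    same way the public at large does today."""
    if user is None:  # logged out: public view, unchanged
        return True
    return sub not in user.banned_from

class User:
    def __init__(self, banned_from):
        self.banned_from = set(banned_from)
```

For gold users keeping the status quo, you'd add one more branch returning True when the account has gold, since they pay to see (but not interact with) the content.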

If you read all this, thank you for your time, I appreciate it.

/r/announcements Thread