Idea: Provide links to mental health and anti-radicalism resources in the upcoming 'content warning' pages on subs that violate a common sense of decency.


I guess what I was trying to get at in my rambling OP is that

  1. Hate subs (i.e. subs that hate a group of people by virtue of a fixed characteristic that they possess) fit the definition of 'hate group'

  2. The Reddit platform facilitates the spread of bigotry and is a hothouse for hate group recruitment

  3. Hate sub membership and recruitment on Reddit matches hate group membership and recruitment elsewhere

  4. There are well-studied underlying reasons for why people are vulnerable to hate groups

  5. There are research-backed ways to prevent people from joining hate groups and to 'deprogram' current members

  6. The 'content warning' page that will be put on hate subs is an opportunity to give vulnerable people resources that address those risk factors, explain common hate group tactics, and debunk common hate group misinformation

I didn't have room to cover all of that, but oh well.

My proposal is to add some sort of link on the new 'content warning' page that uses the information in this OP's links (and other information, of course) to help people who are vulnerable to hate group recruitment, or who have already adopted a hateful ideology, to recognize the features of a hate group and to identify its propaganda tactics and how it directly preys on the insecurities of lost people. The page should also provide resources that address some of the issues that draw people to hate groups in the first place (e.g. information on how and when to seek mental health care, links to assistance or advice subreddits, social support groups, positive and research-backed self-help information, and crisis hotlines).

This should be done in a way that appeals to the average, skeptical, computer-savvy youth: the information should include links to the scientific literature that supports its points, the tone shouldn't be patronizing or try too hard to connect with the audience through slang or memes, and the content shouldn't 'shame' those who have already been drawn into hate groups. Antagonizing people in those communities only makes them feel attacked, which confirms their victim narrative and causes them to retreat deeper into the group and its ideology.

I would wager that a good percentage of the people who participate in hate subs were pushed over the edge, or had their beliefs strengthened, because many redditors on the opposite end of the political spectrum have reacted so strongly against the extremists' rhetoric that they dismiss or ridicule any legitimate issue the hate group's target demographic may experience. This further alienates even the moderate members of that demographic and pushes them closer to radicalization by seeming to confirm that those in the other group are hateful.
Members of in-groups are more likely to ascribe the bad actions of a few out-group members to the out-group as a whole while ascribing bad actions performed by fellow in-group members to the individual alone.

Sorry for the wall of text. In short, my suggestion is:

Provide hate-group recruitment prevention resources and mental health/legitimate self-help information on the new 'content warning' pages, delivered in a tone that appeals to redditors and connects with those vulnerable to hate group recruitment in a way that is genuine and not patronizing.

/r/ideasfortheadmins Thread