Does America really hate black people?

No problem. Some of us have been jaded by the trolls of late.

I'll give a non-snarky answer:

I was born and raised in south Louisiana. Growing up, I'd never thought about racism until I moved to another southern US state (not going to call them out) for a few years when I was still a kid. The town seemed to be run by the KKK. I'm white, Cajun to be specific, so this was all new to me. My mom was a big Southern Baptist back then and went around town trying to draw people to her church. She didn't discriminate; everyone was welcome. Unfortunately, that wasn't the case with the locals, who very adamantly told her that black people were not welcome (it was a thinly veiled threat, delivered with a tight grip on her arm and seething anger).

Going to middle school and high school in this town was an experience in itself. I was discriminated against by the rednecks for being Cajun (I'm as white as can be, blue eyes and everything), simply for not being like them. I'd heard every manner of racist joke and racist name-calling, and I'd seen assaults on minorities in schools. Keep in mind this was the late '90s, so it's fairly recent.

I've lived in a lot of states since then, so I'll share my experiences with them:

The Pacific Northwest has almost no racism from what I've seen; everyone gets along.

In Phoenix, I never felt more welcomed than by my Mexican neighbors, and they were welcomed in turn by me and the other neighbors. It was like a big happy family of shared meals and beer.

I didn't witness anything in the Dakotas. It's a nice place, and while the people are a bit strange (could be the isolation), they're very friendly and warm to everyone.

South Louisiana is really different from other areas: everyone talks to each other, everyone is friendly. People are just people, no matter their skin color, and we're all in this together (that's the feeling I get here).
