Why does it seem the U.S. is far behind in portraying interracial relationships in movies/television as compared to the U.K. and other European countries?

Why!?

Systematic segregation and dismantling of black communities and intellectualism. And the continuation of slavery through the prison system and a one-sided justice system. There was no way for black folk to win in the U.S.

Black businesses were often run out of town. If black people kept to themselves, had their own neighborhoods, businesses, and schools, and started to outpace or out-prosper their white counterparts, white folks from the town would literally come and burn or bomb the neighborhood down.

They accused, lynched, and threw as many black men in jail as they possibly could (they're still doing that), denied them as much work and education as they could, wouldn't let black people vote, and cheated during elections when they could. The U.S. government experimented on black men, assassinated its leaders, and pumped cocaine directly into black neighborhoods.

When black people were allowed to integrate or assimilate into white towns, white flight happened (jobs and businesses left with them), or black people were alienated, underserved, denied access to places, and mistreated.

Now, I wouldn't say black men from the U.K. are better educated; I think they get better media representation, and Europe overall has better social programs to combat disadvantages. However, I've heard many black men from the U.K. speak on the racism they experienced growing up, and a lot of it came from academia. I think it has more to do with U.K. vs. U.S. culture than with U.K. men actually being more educated than their U.S. counterparts.

I'm in my mid-twenties. Though I'm quite proud to be black and wouldn't change it one bit, I don't feel that my whole identity is based on my blackness, and I too feel like an American above all else. Just plain old American, nothing with a dash in front. My family has been here just as long as or longer than the majority of white Americans'.

Now, I think this is the best time in history for black men and women in the U.S. to attempt to integrate into the rest of American society. With black culture at the forefront of entertainment and media, more and more young white Americans are adopting it (and attempting to understand it), and cities are being gentrified at a fast pace. It's prime time to dive in and mix living situations, working situations, and relationships before those young white Americans find jobs, start young families, pick their parents' and grandparents' conservative attitudes back up, want nothing to do with black poverty, violence, and mis-education, and everything becomes extremely segregated again. If that mixing happens, I think things can be better. It won't completely get rid of racism, but maybe young, black, ethical, innocent college graduates can finally get hired over white men and women with criminal backgrounds.

/r/NoStupidQuestions Thread