Why are the English seemingly universally hated? In light of recent events (the football riots in France), why does it seem that everyone hates England and the English? Is there any truth in this? And if so, what are the reasons?

Being apologists doesn't do anything to help people move forward, so that's not what I was implying. I just mean that the overall narrative of British history around the world seems to be one of building nations up, ignoring the violent histories behind it all and calling the British friendly peacemakers. I think we have to acknowledge that our predecessors (I'm Scottish, so I feel comfortable saying this) did shitty things, and that we as white/British people have a duty to fix a lot of the mistakes the people before us made-- whether that's by pushing for equality, doing our best to help refugees coming from places our people helped fuck up, etc. It's not about hating yourself for what was done in the past; it's about recognising the shit that British people did, recognising how we/white people are still benefiting from it, and then taking steps to make everyone equal and to force nations to take accountability.

One of the simplest ways we could do that is by pushing for redesigned history courses in schools-- non-white countries' histories only ever seem to start when they've been invaded or colonised. I was reading the Canadian government's page on the nation's history (the version up under Harper), and First Nations people were literally only mentioned in a few sentences after it detailed how the British came and settled; after that it was all about British triumphs. It ignored the genocides, the wars, the 1800s policies designed to keep out as many East and South Asian immigrants as possible, etc.

I agree that in some ways other countries are paying for what they did in the past (Germany is a perfect example), but Britain is being excused from a lot of shit, which isn't fair. Germany has been demonised since Hitler in every medium possible (even in Looney Tunes cartoons); Britain never had to deal with that. I don't know how America is paying for it, but I do agree that once a nation acknowledges its past and makes an effort to fix things and move forward, it does better. I'm of course talking about Germany here.

/r/AskReddit Thread Parent