What's a common misconception about a historical fact or figure that is easily debunked?

I don't think I want to be offended any more than when I hear words which have been deemed derogatory or offensive. Plus, there is also an ongoing debate, at least in academia, about using the term "Western" to refer mostly to Europe. So, I don't think I'm overreacting. I hope I'm not.

I think the "that's what they call themselves" argument is a poor defense, because after all, words like "nigga" or "nigger" have been debated on that same ground.

Also, like I said in my previous comment, I wasn't implying that you, or anyone who uses the term "West Indies," means anything derogatory by using it.

My issue is with the word itself. After all, what's in a word, right?

To sum up my point, I'll quote an article I read on the Huffington Post the other day on the power of meaning in words:

"I feel uncomfortable even putting into print. Nigger. Wetback. Red Neck. Cracker. Chinks. Spicks. These words are pregnant with incredible potency. These words do not have a history of tolerance, of acceptance, or compassion. No, these words tell the story of oppression -- of an American landscape of racism and mistrust. Without our past, these words have no negative connotations. Yet within our historical landscape of slavery and shame, these words have powerful implications."

http://m.huffpost.com/us/entry/423969

At the end of the day, I feel like I have a little bit of justification for my emotion towards "West Indies". Yes, perhaps it's biased because I grew up under the shadow of America's "neo-imperialism", or even perhaps because there's a HUGE gap in Costa Rican history that makes it seem as if our identity either didn't matter before Columbus, or worse, doesn't deserve to exist.

But perhaps you are right, and I just want to be offended.

/r/history Thread Parent