[Serious] Americans, do you really think the USA is the greatest country on earth? If so, why?

I think the US is the greatest country in the world, but that doesn't mean it is consistently GOOD, because no country is. This country has failed minorities and women for generations, but my faith in its potential to do better is what keeps me from jumping ship, or even considering it. Consider how huge the US is, and how stable (and relatively active) its government has been compared to countries of a similar size. Obviously the current political situation makes that reality seem a little distant, but it is still the reality. There are over 300 million dumbasses living in this country, and yet we all still have the freedom to express our dumbassery in all its colors, and THAT'S what makes America great (not xenophobia, sexism, or the delegitimization of the structures in our government that keep certain presidents from fucking up the country :))

/r/AskReddit Thread