What is the worst thing about living in America?

Dealing with about 80% of my fellow Americans. The country itself is pretty good overall; given its geography, any society that set up shop here would have eventually become an industrialized powerhouse largely insulated from natural hardship. That said, I don't like that a good majority of people here use the founding principles of America as license to be willfully ignorant of anything they don't like or care about, myself included to some extent. This attitude is at least partially responsible for the polarization in our politics and many of the ideological rifts across the country. Willful ignorance is also part of why our literacy rate is declining for the first time since the 1930s, and part of why much of the rest of the world sees us as self-obsessed and obnoxiously "charitable". I could think of plenty more issues tied to willful ignorance here, but I feel like I've hit some of the most important ones.

/r/AskReddit Thread