Do Americans still like America?

I don’t feel proud of the US; I’m extremely concerned. We live in a declining democracy with oligarchs squeezing the working class for every penny they can. We have an increasing number of mass shootings, and our highest court has ruled that women don’t have a right to bodily autonomy. A large portion of Americans only support democracy if they win, and they have (pitifully) attempted to overthrow the government when they haven’t. Decades of increasing political polarization, prioritizing corporate profits, and the military-industrial complex have rotted the American Empire from the inside. Things are still okay for now, but if wealth disparity, mistrust in political systems, falling voter participation, rising political violence, rising xenophobia, calls for civil war, and mistrust of the media are signs of anything, it’s that the US is in very dangerous waters. Whether the nationalists want to believe it or not.

/r/ask Thread