What made you lose hope in the United States?

A plethora of things. Constant school shootings. A cult-like obsession with guns beyond practical uses like hunting.

No accountability or consequences for the rich. The demonizing of peaceful protest by some people who, in the same breath, defend and uplift white supremacy.

The fact that a majority of Americans are one medical emergency away from having to choose between paying the bills and eating. I'm hard-pressed to find even 10 things in a Walmart, or any other store for that matter, that don't say "made in China."

The fact that in my lifetime, wages have pretty much stayed the same while the price of everything constantly goes up, yet those in power vote themselves a pay raise every year. And even if you pull yourself up by your bootstraps like they love to say, you end up burnt out and living to work, with not much beyond that to show for it.

/r/AskReddit Thread