Americans of Reddit, what should change in American culture?

It seems like every few months another one of these "What's wrong with America?" posts shows up.

Nothing is really wrong with America. We've just forgotten who we are. America has some things it can feel ashamed of: the endless wars, slavery, how we treated the natives. There's plenty throughout our history we can look at and hate ourselves for, BUT what America stands for, the idea of America, is what's so awesome. You can be a nobody from anywhere in the world, come to America, and lead a successful, fruitful life with an abundance of freedom. I love my country. I bleed red, white, and blue. I just hate the people in charge.

/r/AskReddit Thread