[Serious] What societal things did Covid-19 change permanently even after we recovered from the pandemic?

I know this'll get a lot of hate because Reddit is very openly anti-Republican, but I'll post it anyway.

A majority of the armed forces are Republican. Since Biden took office there has been a huge push to wokify the armed forces and change the demographic to consist of more Democrats. I come from a very Republican area (Christian, Bible Belt), and they're running ads that go against the majority's values here, which is discouraging a lot of the young men from joining the armed forces.

The military ads today typically talk about diversity, inclusion, trans rights, equity (not equality), and other woke topics. Let me be clear: to get young men and women to join the military, you need to convince them it will be good for them, that they'll enjoy it, and that it's worth potentially losing their lives for. Past ads capitalized on education, adventure, glory, pay, service, etc. Most military men will tell you their motivation is the 3 P's: pay, pussy, and/or promotion. The current ads are geared toward a very different demographic. How do maternity flight suits encourage young men to fight?

Why do I keep talking about young men? Because they're the only ones who want to fight and kill and are willing to do it; as they get older, they mellow out.

My source is firsthand experience.

All this to say, it's much less likely there will be a coup, since we're discouraging young men from joining the armed forces.

/r/AskReddit Thread Parent