What is something most people need to hear, but no one has the guts to tell them?

Whatever life you have in college does not represent adult reality at all.

Don't get me wrong, being an adult with post-college income & purpose is kickass, but I feel that college doesn't even come close to preparing you for what life is going to be like after graduation. It is downright depression-inducing at times: "friends" forget you or dislike you if you do better than them, hanging out becomes super hard, and what you considered deep and fulfilling during college kind of sucks and makes you feel juvenile afterward. Also, in my case, my female friends stopped finding me interesting the moment I got a boring job title and my hairline receded lmao.

/r/AskReddit Thread