Before the Johnny Depp/Amber Heard case, why did the media refuse to believe that a female could abuse a male?

As sad as it is, the media has long framed women as the victims, partly out of sexism and partly out of an older worldview, whereas most men who are abused feel embarrassed or ashamed that it happened to them, and by the time they come forward it's usually too late. There are far more reported cases of women being abused because women have support and do come forward; men usually don't. I don't mean to offend anyone with this, but in general women will be seen as the victims and men will be seen as the abusers. It's going to take a lot to change that view, given that more than half the time it's accurate. Hopefully men will start coming forward more and the media won't automatically back the women. Again, not meant to offend anyone with this.
