Sexist.
Women beat men too.
Women also sexually assault men.
Not all men beat women. Not all women beat men.
Etc.
Feminism used to be about gender equality, so when did it become about hating men?
I consider myself a feminist in the classic sense (I imagine most people would), but I can't stand this new form of feminism.