This seems completely unsurprising. To be clear, I'm an old-school liberal myself. But without any judgement about the justice of various political and social trends, women's huge relative rise in status in the last few decades implies a corresponding decline in the relative status of men. (That's the mathematical meaning of "relative," not a political opinion.) Women now do better than men on every positive measure from pre-school through graduate school, including performance in school itself, and men fare worse than women on every negative measure, e.g. probability of incarceration and drug and alcohol abuse. (The only area in which boys and young men do better than girls is team sports.)

It's no mystery why--look anywhere in mass media or in education. Men are despised and told they are the root of all problems. "White male" and "cis white male" are literally pejorative terms, and "toxic" is by far the most common word associated with "masculinity." Even twenty or twenty-five years ago, when my boys were in elementary school and much too young to have political opinions, they often asked me, "Pops--why do the teachers hate boys?" It was a serious question, and they weren't making it up.

There is a big slice of liberal culture that makes it clear to young men that masculinity itself is to be despised, while the same cultural forces celebrate women constantly and praise being female as an inherently good trait, in explicit contrast to masculinity. How could young men in this climate **not** see liberalism as their enemy?