What will feminization of America look like?

Are we witnessing the feminization of America? And if so, is that a good or bad thing, or is it, like so many quiet but ineluctable trends, a combination of the two?

Perceptions of feminization come from some unexpected quarters. The (mostly) free-market economist Tyler Cowen sees it as a long-term trend, going back to the suffragist movement a century ago and women’s inclination to prefer the perils of peace to the risks of war. “You might argue that I had the best of both worlds,” he reflects as he nears 60, “namely to grow up in the ‘tougher’ society, but live most of my life in the more feminized society.”

As an academic, he lives in an increasingly feminized environment. “A Generation of American Men Give Up on College” is a front-page Wall Street Journal story this week.

In postwar America, men outnumbered women on campus, with many veterans taking advantage of the GI Bill of Rights. In the post-Vietnam decades, that trend reversed. In the past five years, as higher education enrollments declined by 1.5 million students, men accounted for 71% of the drop. The sexual assault kangaroo courts set up by the Obama administration’s “guidance” probably contributed to this decline.

In the 2020-21 school year, women made up 59% of college students overall, and 61% at private four-year schools. Women are also more likely to graduate: 65% of the women who matriculated in 2012 earned a degree within six years, compared with 59% of the smaller number of men.

It seems we’re headed toward a society where women outnumber men by 2 to 1 in higher education, by an even greater margin among college graduates, and by more still among holders of postgraduate degrees.

But as contemporary feminists point out, that doesn’t necessarily mean that women are running things. Historically, college graduates have made much more money than nongraduates, but that gap seems to be narrowing. Men still dominate the ranks of billionaires, CEOs, and, by narrowing margins, major politicians.

It seems male and female graduates seek out different career paths. Half of law and medical students these days are women, but women tend to choose less demanding and less competitive specialties. Women are hugely dominant in veterinary, social work and education schools. Professionally, they are large majorities in caring professions such as nursing and in what one might call the “Karen” professions: corporate human resources departments and university diversity, equity and inclusion bureaucracies.

In effect, biology keeps trumping feminism. Despite the claims of biological men who believe they are women, only biological women can give birth and nurse babies. As Charles Murray notes in his 2020 book “Human Diversity,” even women with very high IQs and unlimited career prospects often stay home to care for their infants.
