Why Men Don't Want the Jobs Women Do (Hint: Money Matters)

It's 2017, yet apparently there are still certain careers that are considered "women's work".

It's disheartening, but it's also worth looking into exactly why that is, and what it means for women in the workforce going forward.

Historically, women have crossed over into male-dominated fields far more often than vice versa. The reason is not only that female-dominated careers such as nursing and teaching tend to pay less, but also that they are, unfortunately, viewed as lower status.

Andrew Cherlin, a sociologist and public policy professor at Johns Hopkins and the author of Labor's Love Lost: The Rise and Fall of the Working-Class Family in America, tells The New York Times:

"Traditional masculinity is standing in the way of working-class men’s employment. We have a cultural lag where our views of masculinity have not caught up to the change in the job market."

Men entering traditionally female-dominated fields is necessary not only for societal change but also for economic growth. The rapid decline of factory jobs is displacing large numbers of workers, most of them men.

In response, fields such as nursing are attempting to attract men by appealing to traditional ideas of masculinity.

Case in point: a recruitment effort by the American Assembly for Men in Nursing, which compares nursing to the adrenaline rush experienced in extreme sports.

This isn't an ideal tactic for attracting men to the nursing profession. But if it allows "women's work" to simply be seen as "work", perhaps it's a necessary evil. The proof will be in the pudding.