Victim Culture Is Killing American Manhood

I grew up in rural Kentucky, where becoming a man meant gaining toughness, shedding weakness, and learning how to take care of yourself and others. This was simply understood, not just by fathers and sons but also by mothers and teachers. In one grade-school incident, I got into a playground fight with another boy and knocked him to the ground. As the teacher rushed up to separate us, she demanded to know what had happened. “He said I hit like a girl,” I told her. “Is this true?” she asked the other boy. Rubbing his face, he nodded. “Well then, you deserved it,” she said.

And that was that.