A lot of people call themselves feminists these days.
And yet there’s something that’s always bothered me about what some of them (not all of them; in my experience, it’s more or less never ‘all of them’) seem to be saying. Things like this:
"Women have to live in fear of men. Women should be afraid of men.”
And then the condemnation of men, fully half of the species, is labeled empowerment. But is it really? When people say certain things, I find, part of the real meaning lies in what they don’t say.
When I hear “Men are predators”, I also hear the whispered “And women are prey.”
When I hear “Men have all the power”, I hear the unspoken echo: “And women are powerless.”
And now? I’m becoming convinced that a lot of people don’t really care about empowerment at all. Not as much as they care about anger. Because the weaker and more powerless people feel, the easier it is for them to be angry - and to feel justified in their anger.
As for me? I think women - at least in a lot of places, at least in places like the US and the UK, where most of the people who say these things seem to come from - are stronger, far more powerful, and far more in control of their lives and their fears than such people would ever care to admit.
So now I ask: Which of these views is truly empowering? To teach people to embrace fear - or to teach them to rise above it?