*****
Readers, after much earnest cogitation (great word!) during which my poor brain nearly perished, I concluded that American feminists, or, to use a tawdry epithet, "femi-Nazis," are beating a dead horse in continuing to harp upon the theme of women's rights.
Simply put, I don't believe women's rights in America and similar Western countries should even be an issue anymore. It's over: women in America now have the right to do, legally and without cultural constraint, nearly anything they might take it into their heads to do.
They can live with men free of matrimonial trammels, dress so that most or all of their bodies are exposed, commit acts of unbelievable foolishness (look up 'yarn bombing'), even marry each other! (At least in my state.) And abortion is also legal, which brings up a whole other issue that I shan't go into.
But basically, American feminists just need to shut up and quit whining like they are slaves or something. "Feminism is the radical notion that women are people." COME ON. When have women been treated like they're not people in this country?!?!?! (I mean as a culture, not individually.) My GOODNESS!!!
This is a pet peeve of mine, because in many countries today, mainly Muslim countries, women can't even sing, show their faces, or do much of anything. (Not that they would even understand our viewpoint, as they are educated that way.)
...But no, we rarely hear about that... which is another issue I won't discuss today. (No, I am not writing an essay.)
*****
I agree. In fact, it seems like our culture has shifted toward so strong a women's "movement" that men are now being treated unfairly!
For example, my husband is one of only three or four men teaching at the local college in our area. All the other teachers are female. Even the President and the Deans are female (and it is NOT a women's-only college!).
Also, men have to pay more for car insurance and are required to register for the draft.
So all this boo-hooing from the feminists gets on my nerves, too. If anything, men are the ones treated as inferiors... not women!
--Genipher