Readers, after much earnest cogitation (great word!) during which my poor brain nearly perished, I concluded that American feminists, or to use a tawdry epithet, "femi-Nazis," are beating a dead horse by continuing to harp upon the theme.
Simply put, I don't believe women's rights in America and similar Western countries should even be an issue anymore. It's over: women in America now have the right to do, legally and without cultural constraint, nearly anything they might take it into their heads to do.
They can live with men free of matrimonial trammels, dress so that most or all of their bodies are exposed, commit acts of unbelievable foolishness (look up "yarn bombing"), even marry each other! (At least in my state.) And abortion is also legal, which brings up a whole other issue I shan't go into.
But basically, American feminists just need to shut up and quit whining as if they were slaves or something. "Feminism is the radical notion that women are people." COME ON. When have women been treated like they're not people in this country?! (I mean as a culture, not individually.) My GOODNESS!
This is a pet peeve of mine, because in many countries today, mainly Muslim countries, women can't even sing, show their faces, or do much of anything. (Not that they would even understand our viewpoint, as they are educated that way.)
...But no, we rarely hear about that... which is another issue I won't discuss today. (No, I am not writing an essay.)