Is there still a gender gap in today's Western society? Are women really still as oppressed as feminists argue?
As a woman in today's Western society, I feel that cultural awareness of women's rights has gone past the point of fairness and equity. Today's Western world feels like a woman's world. Thoughts?