Radical feminism
Feminism is the belief that men and women should have equal rights and opportunities. It encompasses social, political, and economic equality. A belief is one thing; putting it into practice is another.
Women have fought for equal rights and have clearly made some progress. But inequality between men and women persists, and that persistence has given rise to radical feminism, an outgrowth of the 1960s women’s liberation movement. Radical feminists believe inequality stems from patriarchy and that male supremacy must be abolished in the economic, legal, social, and political arenas.
Hey, guys . . . don’t worry, it’s safe to read on. Radical feminists don’t oppose men, just patriarchy. Here are some more modern feminist terms to help you navigate this evolving conversation.