“In my own view, feminism is the claim to the truth that women have moral, political, and sexual agency and should have commensurate influence in the world. Feminists are those who speak that truth and influence social mores, culture, and public life.”