Feminism refers to the advocacy of equal rights for women in society, and the term dates back more than a century, to a time when women in most cases could not vote, own property, or engage in business. Equal rights for women have come to be expected in the USA, but in many countries around the world women are still treated as inferior, or even as the property of male relatives, and have few if any rights.