Feminism [noun]:
"The belief that women should be allowed the same rights, power, and opportunities as men and be treated in the same way, or the set of activities intended to achieve this state."
Cambridge English Dictionary.
I guess that's a pretty straightforward definition of what feminism is, despite the common misconception that...