FEMINISM
Meaning of FEMINISM in English
[noun] [U] - the belief that women should be allowed the same rights, power and chances as men and be treated in the same way, or the set of activities intended to achieve this state: As a writer she was chiefly noted for her lifelong commitment to feminism.
Cambridge English vocab. 2012