Feminist Art Definition

Feminist Art refers to the art movement that emerged in the 1960s, in which women artists sought to bring women's perspectives into their artworks. Artist Suzanne Lacy declared that the goal of feminist art was to "influence cultural attitudes and transform stereotypes." The Feminist Art movement succeeded in gaining representation for women artists in galleries and exhibitions that had previously been denied to them on the basis of gender.