We've been discussing feminist art theory at college. I know. One minute I'm cleaning out the mysteriously long hairs in the shower, the next I'm pondering the whys and wherefores of gender in artistic practice and criticism. Get me.
Anyway... after a particularly rant-filled morning yesterday about feminism and its relevance today, I'm quite confused. Did feminism work so well that we don't need it any more, or do we need it even more now?
If I look around, it doesn't seem that much has changed. We've still got eye-searingly sexist adverts (that one where the bloke drinks 0% sugar cola and some woman pitches up in a bikini to fix his truck, with her friend in tow... perrrrrrrleaaaaaaase!). It seems to me that feminism has been repackaged so that as long as sexism is done with a wink and a nod, it's OK. We ladies should just get over ourselves.
Were/are you a feminist?