I grew up with my grandmother always close by. She was my primary caregiver a lot of the time. Grandma was a romance-book-reading, soap-watching, bon-bon-eating divorcée in a house dress. She didn’t believe ladies should like sex. Have you ever read a Harlequin romance novel? Lots of sex. Soap operas also have lots of sex. From this conflicting backdrop, I formed my views about that most intimate of acts.
The prevailing philosophy on women’s sexuality was that we didn’t have one. Maybe that should be amended to say that ladies didn’t have a sexuality. This went on for a thousand years or so. Then we got the pill and things started to change. Practical need has always shaped morals. Letting women think they could run around and have free sex would lead to lots and lots of babies that society could not have supported, so we were told we weren’t allowed to like it. Given that the kind of man who keeps a woman down is probably not much for giving orgasms, they probably weren’t wrong.
So women get the pill, and it’s the ’60s and everybody is rebelling. Women are having sex and liking it with no repercussions, minus a few venereal diseases. Now, shaping my life is a thoroughly repressed, sex-hating grandmother, and her daughter, my mother, who embraced the ’60s pretty thoroughly but was brought up in a repression-heavy house. Grandma didn’t talk about sex. Mom talked about it a lot. Romance novels and soap operas, with their mixed messages, further clouded things. I could have sex with anyone I wanted, provided I used lots of contraceptives, and then feel guilty and slutty later!
I’m 40, and it took me quite a while to work things out. More recent generations of young women are out there with seemingly no hangups. Women are proudly having sex with whomever they want and declaring themselves free. Sex and the City, 50 Shades of Grey, and Magic Mike are all demonstrative of society’s changing acceptance of women’s sexuality. We’re reveling in it. We get to be as bad and objectifying as men ever were.
But is this true to what women want, or are we mimicking men’s behavior as represented in popular culture? Are we just being reactionary? Women (and too often, girls) are owning their sexuality, but it’s possible this will have long-term damaging effects on self-esteem. There are still too many conflicting values coming at us. Are we women who own our sexuality, or are we kidding ourselves, throwing ourselves at too many things at once? What will happen with the next generation?