Okay, sorry to jump on the bandwagon here, but I am so curious!
I went to the movies today with my mom (it is her birthday!), and there was the longest line I had ever seen for the Sex and the City movie! The audience was primarily young women.
I have seen the TV show, and I do find it entertaining, sometimes even thought-provoking. I became disappointed when the focus narrowed to what they were wearing and what club they were getting into, rather than the relationships themselves (but I "get" that as part of the show).
What I don't "get" is is all the media hype, and it does seem to be more about the fashion and less about the women's relationships and the IDEA that WOMEN can TALK about SEX..it is no longer "forbidden"! I am all for that, but now I am questioning the motives behind this movie and the movie goers. Am I reading too much into this...is it just a movie?!
I would love to hear your thoughts on the movie, whether you've seen it or not, and what you think about these types of movies...has society advanced with the notion of women being free to talk about (and have) sex, or is it an unhealthy portrayal of intimate sexual relationships?