Instead of making women think they don't need to buy a new pair of shoes for every new date, characters like Carrie almost insist that we do. Not because we need to, but because we have the right to (it's our money, after all, and as independent women we can decide for ourselves), and therefore should want to. And in case you've blown your savings on 100 pairs of Manolos, don't worry, because your art gallery friend will lend you the money to save your ass, because that's what friendship is for.

No matter how professionally successful Dr Brennan is in Bones, she remains incomplete because she doesn't give in to her feelings. We, the audience, are supposed to feel a little sorry for her, but deep down we know that eventually she must sacrifice her professional partnership for love, because isn't that what women do?

In Lost, Kate has been going back and forth between Jack and Sawyer (more or less depending on which one is wanted by someone else), and her character has been reduced to a desperate, weak runaway whose destiny is tied only to the man she eventually settles down with. (And for the record, if she ends up with Sawyer, I will be very, very upset, because he deserves someone better!)
I don't know what the point of these female characters is, or why they make me so angry. Am I supposed to feel empowered as a woman because my own existence feels more meaningful than theirs? Am I supposed to see TV women as role models or even feminists, because they at least seemingly do whatever they decide to do? Or am I supposed to identify with them because they are flawed just like everyone else? I just feel sorry for them. I can't identify with them, and I don't even want to. Maybe I just watch too much TV. Well, at least I have Betty White.