I was a bit horrified to read this story on the Newsweek site today about a new picture book that explains to children why mommy gets plastic surgery... because it makes her prettier!
The argument for this book was made by its author, a plastic surgeon, who sees increasing numbers of clients bringing children into his office. At first glance, it seems like a great idea to explain plastic surgery to children, since more and more people, especially women, are getting it every year. However, this book does not talk about why mommy doesn't feel pretty (increasing demands from society and the media) or why surgery is her only option. Instead, it explains that plastic surgery gives mommy a better tummy, a prettier nose, and (without explanation) much larger, perkier breasts.
I am not a mom, so I can't say that I would never want or get plastic surgery. I can say, though, that I think it's troubling that our society makes women feel that their post-baby bodies are unacceptable and less pretty than their pre-pregnancy counterparts. If we live in a society where we skirt around explaining body image and self-esteem to children and instead justify capitalistic remedies like plastic surgery, then we have a lot of problems.
To sum it up: I'm really angry that our society sets out to make pretty much everyone, especially women-identified folks, feel insecure about their bodies and the natural changes our bodies go through.
Read that article. How do you feel about it?