I Thought I Was Body Positive
The sobering moment when you realize you’re not as ‘woke’ as you think you are.
Body positivity is a social movement rooted in the belief that all human beings deserve a positive body image, one that challenges the ways society presents and views the physical body.
We all know the common consequences of a negative body image: eating disorders, anxiety, depression, relationship violence, and drug abuse.
For a long time, I thought I was on the positive side of this movement; in a lot of ways, I still am. I hate it when someone criticizes the way another person dresses because they supposedly “don’t have the body” for it. I call out people who say they’re “just concerned about their health” after criticizing a fat person, but who say nothing as they watch me eat two bowls of instant ramen at 1 a.m. Apparently my erratic sleep schedule and bad eating habits aren’t enough to warrant ‘concern over my health.’ I support people, especially women, in their journey to be comfortable in their own skin. Wear that ‘risky’ outfit, flaunt what you have, and pay no attention to the people giving you a hard time.
And yet that little advocate in me comes crashing down when I look at my own reflection.