
Telling a Woman to Smile Is Sexist

October 12, 2017

By: ATTN: Video


Society expects women to smile all the time, and if they aren't smiling, people tell them to. Telling a woman to smile reinforces the idea that women are inferior to men.