Smiling has always been a sign of comfort, especially as a warm gesture passed from one person to another.
This gesture becomes uncomfortable, however, when a man demands it of a woman. Which brings us to the question: why are men always telling women to smile?
“Why don’t you ever smile?”
“Just smile for me.”
“You’d look better if you smiled.”
While baring one’s teeth is a natural expression of happiness, confrontations like these not only set an uncomfortable tone but also taint an innocent smile with an ulterior purpose.
Popsugar recently published a survey showing that about 87% of American women between the ages of 18 and 64 have been sexually harassed by men on the street. In other words, the data suggest a woman is far more likely than not to be harassed in public.
The science behind the smile was discussed in Scientific American by Frank McAndrew, professor of psychology at Knox College. He explained that smiling is believed to have originated early in mammalian evolution.
“In primates, showing the teeth, especially teeth held together, is almost always a sign of submission,” said McAndrew. “The human smile probably has evolved from that.”
Dr. Janice Porteous, a philosophy professor at Vancouver Island University, spoke to Live Science about the evolution of smiles back in 2012. She found that higher primates, which include monkeys, apes, and humans, use smiles in response to perceived threats of dominance and aggression.
“The expression seems to deflect the dominant’s aggression,” Porteous explained. “So it’s a sign of submission, non-hostility or appeasement, resulting in the dominant leaving them alone.”
Previous studies on gender differences have been read as evidence that women are socially weaker and less dominant than men because they tend to smile much more frequently. However, according to Marianne LaFrance, a psychology professor at Yale, only about 20% of smiles are genuine.
A man telling a woman to smile, no matter how innocent his motive, is inherently a dictatorial act. Being commanded to smile in public is rarely received as flirtation; it reads more like a threat, especially when the command is shouted by a stranger on the street.
Smiling should be a natural act that makes a woman feel good. Men who tell women to smile will say that they just want them to be happy, but studies have revealed that when it comes to forced smiles, things aren’t always as simple as baring teeth.