LGBTQ groups have criticized the findings from a new "gaydar" report, but the authors stand by their research.
Can you tell if a person is gay or straight simply by looking at their face? Researchers at Stanford University say they’ve developed a computer algorithm that can make a very good guess. In a new study, the artificial-intelligence program accurately identified men as gay or straight 81% of the time, and women 71% of the time.
That’s better than we humans can do with our own eyes and brains. Using the same photographs, study volunteers correctly identified men’s and women’s sexual orientation only 61% and 54% of the time, respectively. Previous research has also suggested that people’s assumptions about sexual orientation—based on looking at faces alone—are correct only slightly more than half the time.
But the new research, scheduled to be published in the Journal of Personality and Social Psychology, is not without controversy. Shortly after media outlets reported the study last week, two prominent gay-rights groups—GLAAD and the Human Rights Campaign—released a joint statement criticizing the research and voicing concerns about its potential implications.
Those groups call the study “dangerous and flawed research that could cause harm to LGBTQ people around the world.” The researchers behind it have responded by defending their findings and their motivations for publishing them.
But back to the science in question here: How can a headshot alone reveal clues about sexual orientation?
For the study, the researchers analyzed hundreds of thousands of publicly available photographs from profiles on a popular American dating site. When they narrowed those down to photos with faces of sufficient size and clarity—and made sure that men and women, as well as gay and straight people (based on information in their profiles), were all equally represented—they had a sample representing nearly 15,000 members.
The researchers fed most of these images into a software program that created “faceprints.” The program looked for consistencies among those who were interested in same-sex partners. With this information, the software developed a predictive model, which the researchers then tested against other photographs not included in the initial batch.
When shown one photo each of a gay man and a straight man, the program was able to identify which was which 81% of the time. When five different photos of each man were included, accuracy improved to 91%. For women, the model was slightly less accurate: 71% accurate with one photograph and 83% with five.
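The evaluation described above is a pairwise one: the program is shown one photo from each group and asked which is which, and accuracy rises when it can average its judgment over five photos per person. The following sketch is purely illustrative and is not the study's actual code; the score distributions are invented to show how pairwise accuracy is computed from a classifier's scores and why averaging several photos improves it.

```python
# Illustrative sketch, NOT the study's code: pairwise "which of two photos"
# accuracy from classifier scores, and the effect of averaging five photos.
import random

random.seed(0)

# Hypothetical scores: a higher number means the model leans toward "gay."
# The two groups' distributions overlap, so single comparisons are imperfect.
gay = [random.gauss(1.0, 1.0) for _ in range(1000)]
straight = [random.gauss(0.0, 1.0) for _ in range(1000)]

def pairwise_accuracy(pos, neg):
    """Fraction of cross-group pairs where the higher score lands on `pos`."""
    wins = sum(p > n for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def averaged(scores, k=5):
    """Average scores in groups of k, mimicking five photos per person."""
    return [sum(scores[i:i + k]) / k for i in range(0, len(scores), k)]

acc_one = pairwise_accuracy(gay, straight)
acc_five = pairwise_accuracy(averaged(gay), averaged(straight))
print(acc_one, acc_five)
```

Averaging reduces the noise in each person's score, so the two groups' averaged distributions overlap less, which is the same reason the study's accuracy rose from 81% to 91% (men) and 71% to 83% (women) with five photos.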
Study authors Michal Kosinski, PhD, and graduate student Yilun Wang say this model supports a hypothesis known as the prenatal hormone theory, which suggests that sexual orientation is influenced by levels of testosterone, estrogen, and other sex hormones a baby is exposed to even before birth—factors that can also influence facial traits, structure, and behaviors like grooming style.
Gay men, for example, “are predicted to have smaller jaws and chins, slimmer eyebrows, longer noses, and larger foreheads,” the authors wrote in their paper, while “the opposite should be true for lesbians.” Consistent with this theory, gay faces included in the study tended to be "gender atypical."
The study has several limitations, including the fact that it only included white participants—the only racial group represented widely enough in the potential sample, the authors say. It also classified people as “gay” or “straight” mainly based on who they were “interested in” meeting on the dating site, even though a person’s sexual identity may be more complicated. (The authors purposely did not include anyone interested in both genders, or who described themselves as bisexual.)
GLAAD and the Human Rights Campaign have criticized these points, along with others. They also worry that media outlets reporting on the study will inaccurately claim that a computer can tell, based on one photo, whether a person is gay.
“Technology cannot identify someone’s sexual orientation,” said Jim Halloran, GLAAD’s chief digital officer, in the groups’ joint statement. “What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated.”
In other words, the statement notes, it’s not surprising that openly gay white people of similar ages who use the same dating site post photos of themselves “with similar expressions and hairstyles.”
The study authors acknowledge these shortcomings; they write in the paper that photos on a dating website may be especially revealing of orientation, and that in a real-life scenario—as opposed to a lab setting in which gay and straight people are compared literally head-to-head—the technology would likely be less accurate.
They also say they're surprised that gay-rights groups are so critical of their research, since it serves to support the idea that gay people are "born that way" and can't simply "decide" to be straight, reports the Guardian.
Finally, the authors address concerns that the technology could be abused by being used to inaccurately identify straight people as gay or to out closeted gay people. “As the governments and companies seem to be already deploying face-based classifiers aimed at detecting intimate traits, there is an urgent need for making policymakers, the general public, and gay communities aware of the risk that they might be facing already,” they wrote.
They point out that they “did not create a privacy-invading tool,” and that their findings offer no advantage to any other groups working to develop one. “We hope that our findings will inform the public and policymakers,” they concluded, “and inspire them to design technologies and write policies that reduce the risks faced by homosexual communities across the world.”