Study claiming AIs can tell sexuality based on face shape called into question
A controversial study which claimed that artificial intelligence machines were able to work out someone’s sexuality from their face has been called into question.
Research at Stanford University last year found that human faces have subtle differences which can denote sexuality, IQ and even political views.
Experts Michal Kosinski and Yilun Wang claimed that machines can pick up intimate traits in a face which humans are not capable of spotting.
According to The Economist, a programme developed to figure out people’s sexuality from just their faces performed with remarkable accuracy.
The study was based on unproven theories that hormonal differences in the womb can lead to different physical attributes.
But a second look at the study by researchers at Google suggested that there is simply a difference in how straight, gay or bi people take selfies.
Margaret Mitchell and Blaise Aguera y Arcas from Google, and Alex Todorov from Princeton, carried out the new research.
They said: “The obvious differences between lesbian or gay and straight faces in selfies relate to grooming, presentation, and lifestyle - that is, differences in culture, not in facial structure.”
“Heterosexual men tend to take selfies from slightly below, which will have the apparent effect of enlarging the chin, shortening the nose, shrinking the forehead, and attenuating the smile,” researchers found.
“This [angle] emphasises dominance - or, perhaps more benignly, an expectation that the viewer will be shorter.
“On the other hand, as a wedding photographer notes in her blog, ‘when you shoot from above, your eyes look bigger, which is generally attractive – especially for women,'” they added.
The Princeton and Google researchers used selfies of themselves to demonstrate their theories about the angle of photos taken.
The original researchers at Stanford used 130,741 images of 36,630 men and 170,360 images of 38,593 women downloaded from a popular American dating website.
The images – half of which were of gay people, half heterosexual – were put into a computer system, VGG-Face, which produced a string of numbers to represent a person’s faceprint.
The next step was to use a predictive model, known as logistic regression, to find correlations between the features of those faceprints and their owners’ sexuality.
The computer’s conclusions were then compared against the sexuality listed on their dating profile.
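The pipeline described above, a fixed-length faceprint vector fed into logistic regression, can be sketched in plain Python. This is an illustrative toy, not the Stanford code: the real study used VGG-Face embeddings with thousands of dimensions, while here two-dimensional synthetic vectors and a hand-rolled gradient-descent trainer stand in for them.

```python
import math
import random

def train_logistic_regression(X, y, lr=0.1, epochs=500):
    """Fit weights w and bias b by stochastic gradient descent on log loss."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, label in zip(X, y):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid: predicted probability
            err = p - label                 # gradient of log loss w.r.t. z
            for i in range(n_features):
                w[i] -= lr * err * x[i]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that faceprint x belongs to class 1."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy "faceprints": two synthetic clusters standing in for real embeddings.
random.seed(0)
X = ([[random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)] for _ in range(50)]
     + [[random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)] for _ in range(50)])
y = [0] * 50 + [1] * 50

w, b = train_logistic_regression(X, y)
accuracy = sum((predict(w, b, x) > 0.5) == bool(label)
               for x, label in zip(X, y)) / len(X)
```

The key point, and the crux of the Google researchers' critique, is that the model finds whatever correlations separate the labels, whether those come from facial structure or simply from how the photos were taken.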
The model accurately predicted the person’s sexuality in 81% of cases, according to the Stanford research.
When shown five images of a person, the artificial intelligence system managed to predict the person’s sexuality correctly in 91% of cases.
The model performed worse with women, telling gay and straight women apart with 71% accuracy from one photo, and 83% from five.
That accuracy far outstrips human judges, who correctly guessed sexuality from a single photo in just 61% of cases for men and 54% for women.
The model performed less well when given a set of images more representative of the wider population, in which seven of every 100 faces belonged to gay people.
However, when asked to pick out the 10 faces it was most confident were gay, nine of the 10 it selected were in fact gay.
Experts fear such technology could be used in societies where homosexuality is criminalised or socially unacceptable, and hope highlighting it now will alert policymakers to the possibility of machine abuse in future.