From The Guardian:
An algorithm deduced the sexuality of people on a dating site with up to 91% accuracy, raising tricky ethical questions
Sam Levin in San Francisco, Thursday 7 September 2017
Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research suggesting that machines can have significantly better “gaydar” than humans….
The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website.
The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using “deep neural networks”, meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.
The research found that gay men and women tended to have “gender-atypical” features, expressions and “grooming styles”, essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.
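The pipeline the quote describes (a deep network turns each photo into a feature vector, then a simple classifier is trained on those vectors) can be sketched in miniature. This is not the authors' code: the "embeddings" below are synthetic 128-dimensional vectors standing in for deep-network features, and all dimensions and numbers are illustrative.

```python
# Hedged sketch of the quoted pipeline: deep-net features -> simple classifier.
# The feature-extraction step is faked with synthetic vectors; only the
# classifier (plain logistic regression by gradient descent) is real.
import numpy as np

rng = np.random.default_rng(0)
DIM = 128          # size of the (simulated) face embedding
N_PER_CLASS = 500

# Two synthetic clusters standing in for embeddings of the two classes;
# the shifted mean is the separable "signal" the classifier learns.
X = np.vstack([
    rng.normal(0.0, 1.0, (N_PER_CLASS, DIM)),
    rng.normal(0.4, 1.0, (N_PER_CLASS, DIM)),
])
y = np.array([0] * N_PER_CLASS + [1] * N_PER_CLASS)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression trained by batch gradient descent.
w = np.zeros(DIM)
b = 0.0
lr = 0.1
for _ in range(300):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"train accuracy on synthetic embeddings: {acc:.2f}")
```

The point is only that once a network has compressed each face into a vector, the classification step itself can be very simple.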
On average, lesbians seem to have been dealt a more masculine set of genes than straight women. A lot of lesbians seem like they wouldn’t be lesbians if they had a better class of men hitting on them.
A lot of star women basketball players, for example, seem to just need an extremely tall and quite masculine guy to marry: 1970s basketball great Ann Meyers found true love in the arms of 6’5″ Hall of Fame pitcher / raconteur Don Drysdale. Similarly, the friends of 6’5″ WNBA star Lisa Leslie found her a 6’7″ black guy, a cargo-jet pilot, to marry.
In contrast, male homosexuality seems more like a switch that is flipped. It’s not like diver Greg Louganis is gay because he doesn’t have enough muscles to attract a girlfriend.
Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.
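The jump from single-photo to five-photo accuracy has a plain statistical reading: if each photo yields a noisy score for the same person, averaging scores cancels noise. A toy simulation (all numbers invented, nothing from the study) shows the effect:

```python
# Hedged sketch: averaging noisy per-photo scores makes the per-person
# decision more reliable. Signal strength and noise level are made up.
import numpy as np

rng = np.random.default_rng(1)
N_PEOPLE = 20000
true_label = rng.integers(0, 2, N_PEOPLE)        # 0 or 1 per person
signal = np.where(true_label == 1, 0.5, -0.5)    # underlying per-person score

def accuracy(n_photos):
    # each photo = true signal + independent noise; decide on the average
    noise = rng.normal(0.0, 1.5, (N_PEOPLE, n_photos))
    avg_score = (signal[:, None] + noise).mean(axis=1)
    return np.mean((avg_score > 0.0) == true_label)

acc1 = accuracy(1)
acc5 = accuracy(5)
print(f"1 photo : {acc1:.3f}")
print(f"5 photos: {acc5:.3f}")
```

With five photos the noise standard deviation shrinks by a factor of √5, so the averaged score crosses the decision threshold on the correct side more often.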
I’m guessing the human brains used in this experiment belonged to unworldly undergrad psych majors rather than, say, 61-year-old Hollywood casting agents.