And what did he do exactly? Using a newly built AI program, he scanned more than 30,000 photos from an unnamed online dating site and attempted to determine whether the individuals pictured were gay or straight. Surprisingly, the program correctly identified gay men 81 percent of the time and gay women 71 percent of the time. Humans asked to do the same task performed much worse, guessing correctly only 54 to 61 percent of the time. In short, the robot "gaydar" did a better job.
The AI program relied on a number of factors to infer each individual's sexual orientation. It was designed to find a reliable pattern in facial features that could distinguish gay faces from straight ones. According to the researchers, the program ended up relying on individual differences in facial structure: gay men's faces tended to appear more feminine, while lesbian women's faces tended to appear more masculine.
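At its core, the pipeline the researchers describe (facial measurements in, orientation label out) is a standard supervised classifier. The sketch below is not the study's actual model; it trains a tiny logistic-regression-style classifier in pure Python on entirely made-up "facial measurement" vectors, just to show the general shape of such a system.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Train a minimal logistic-regression classifier by gradient descent.
    X: list of feature vectors, y: list of 0/1 labels."""
    n = len(X[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid probability
            err = p - yi                         # gradient of log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Hypothetical training data: each vector stands in for measurements
# extracted from a face photo (e.g. jaw width, brow prominence).
# The numbers and labels are purely invented for illustration.
X = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8]]
y = [0, 0, 1, 1]

w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

The point of the sketch is that nothing in the pipeline is exotic: given enough labeled photos and a way to turn faces into numbers, an off-the-shelf classifier can be trained to guess any trait that correlates with those numbers.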
All things considered, the results were broadly consistent with the so-called prenatal hormone theory of sexual orientation, according to the researchers. This theory posits that human sexuality is determined, in large part, by hormone exposure in the womb. Some of the researchers' colleagues consider this a controversial view, and they also argue that the research is biased because the photos scanned were uploaded by the users themselves rather than taken under neutral lab conditions.
The details of this study, which were first released in September of 2017 and are scheduled to appear in the February edition of the Journal of Personality and Social Psychology, show both the benefits and the pitfalls of using AI and facial recognition to determine certain traits of individuals. The study's lead author, Michal Kosinski, a Polish psychologist who studies human behavior using online data, said that his overarching purpose in conducting the study was simply to warn people about the ongoing erosion of online privacy. However, LGBTQ rights groups have decried the research, saying that it endangers the LGBTQ community in entirely new ways.
In any case, this study and its results show quite clearly that turning a capable AI program loose on certain sets of human data could lead to a massive invasion of online privacy. As you may already know, not every gay person on Earth is openly gay. To an algorithm like the one used in this study, the fact that some people haven't "come out" yet wouldn't matter -- not one bit.
Imagine an employer or an educational institution using it as part of a pre-screening process -- would you be comfortable with that? Or how about as part of a government survey or a census in a country where homosexuality is considered a crime -- the closeted gay people living there wouldn't stand a chance. "Basically, going forward, there's going to be no privacy whatsoever," said Kosinski. With technology like the one he developed, there certainly wouldn't be.
Read more about the issues surrounding AI and online privacy on Robots.news.
Sources include: