Source | www.strategy-business.com
One of the most controversial psychological studies in recent memory appeared last month as an advance release of a paper that will be published in the Journal of Personality and Social Psychology. Yilun Wang and Michal Kosinski, both of the Graduate School of Business at Stanford University, used a deep neural network (a computer program that mimics complex neural interactions in the human brain) to analyze photographs of faces taken from a dating website and detect the sexual orientation of the people whose images were shown. The algorithm correctly distinguished between straight and gay men 81 percent of the time. When it had five photos of the same person to analyze, the accuracy rate rose to 91 percent. For women, accuracy was lower: 71 percent and 83 percent, respectively. But the algorithm scored much higher than its human counterparts, who guessed correctly, based on a single image, only 61 percent of the time for men and 54 percent for women.
Of course, methods like this could be used to out people who are closeted, or to falsely identify them as gay or lesbian. The LGBT advocacy groups GLAAD and the Human Rights Campaign jointly condemned the study as inaccurate, pointing out that it didn't account for bisexuality and included no nonwhite faces. But as the Washington Post noted, there are even more fundamental issues at stake. Repressive governments, intolerant businesses, or blackmailers could use tools like these to target individuals.
The study also raises issues beyond sexual orientation, with at least as much potential for invasion of privacy and abuse. Algorithms like this rely on machine learning. Through repetition and calibration, the programs match their models against reality and continually refine those models until they achieve immense predictive accuracy. A program of this sort could pick up on attributes that human beings are entirely unaware of, and glean immense insight about people from them. A world in which this is prevalent becomes a world like that of the film Minority Report, with people continually adjusting themselves toward more “normal” behavior because the systems around them track not only what they have done, but what they might be capable of doing.
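The “repetition and calibration” loop described above can be sketched in a few lines of code. This is a minimal, illustrative example only, not the study’s actual model (which was a deep neural network trained on facial images): here a simple logistic classifier is repeatedly shown synthetic examples with known labels, and its internal weights are nudged after each error until its predictions closely match reality.

```python
import math
import random

random.seed(0)

# Synthetic stand-in data: each example is two numeric features,
# and the "true" label depends on whether their sum is positive.
data = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
labels = [1 if x + y > 0 else 0 for x, y in data]

# The model: two weights and a bias, all initially uninformative.
w = [0.0, 0.0]
b = 0.0

def predict(x, y):
    """Return the model's estimated probability that the label is 1."""
    z = w[0] * x + w[1] * y + b
    return 1 / (1 + math.exp(-z))

# Repetition and calibration: compare each prediction against the
# known label and nudge the weights to shrink the error, many times.
lr = 0.5
for epoch in range(50):
    for (x, y), label in zip(data, labels):
        err = predict(x, y) - label  # prediction error on this example
        w[0] -= lr * err * x         # gradient step on each weight
        w[1] -= lr * err * y
        b -= lr * err

# After training, measure how often the refined model is right.
accuracy = sum(
    (predict(x, y) > 0.5) == bool(label)
    for (x, y), label in zip(data, labels)
) / len(data)
```

The unsettling point the article makes is visible even here: the loop never needs to be told *why* the features relate to the label; it simply finds whatever statistical regularities separate the two groups, which is exactly how a model could latch onto cues no human observer is aware of.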