Artificial intelligence (AI) is an interesting, if admittedly creepy, development in the tech universe.
And although some AI developments are beneficial to human existence, we recently discovered a branch of AI that redefines creepy.
The AI study
A new study from Stanford University found that AI, via deep neural networks, is more accurate than humans at detecting someone’s sexual orientation from facial images. The researchers ran a specific algorithm on about 35,000 facial images.
The AI “was successful 81 percent of the time [when] analyzing gay men and 74 percent when analyzing an image of gay women,” Dazed Digital reports.
“The focus of this tech includes both permanent facial features and less fixed areas, i.e. grooming style.”
Dr. Michal Kosinski and Yilun Wang conducted the study.
Kosinski adds that this type of AI may be able to detect other human traits, such as IQ or political views.
You don’t have to watch an episode of “Black Mirror” to know that AI and technology can destroy humans.
And with the political and social climate the way it is, predictive AI is nothing short of dangerous.
“Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay,” Ashland Johnson of the Human Rights Campaign (HRC) said in a recent statement regarding the AI research.
Brian Cugelman, Ph.D., echoes the HRC’s concerns. “In many countries, it’s illegal to be gay, and there are some parts of the world where you can be executed,” Cugelman says.
“The worry is that a country could scan Facebook, and place citizens at risk of prison because of their physiology.”
Tim Lynch, Ph.D., president of Psychsoftpc, adds that predictive AI can only hurt humans.
“This type of thing can be applied to other areas of life. We run the risk of completely losing our privacy,” Lynch says. “These suppositions [can be] used by advertisers, insurance companies, health care providers, employers, and law enforcement.”