Aruba, September 25, 2017 - Artificial intelligence keeps getting creepier. In one controversial study, researchers at Stanford University have demonstrated that facial recognition technology can identify gay people with surprising precision, although many caveats apply. Imagine how that could be used in the many countries where homosexuality is a criminal offense.

The lead author of the “gaydar” study, Michal Kosinski, argues that he’s merely showing people what’s possible, so they can take appropriate action to prevent abuse. I’m not convinced.
When people hear about algorithms recognizing people through masks, finding terrorists and identifying criminals, they tend to think of dystopian movies like “Minority Report,” in which Tom Cruise prevented murders with the help of “precogs” -- human beings with supernatural, albeit fatally flawed, foresight.
Reality is much worse. We don’t have precognition. We have algorithms that, although better than random guessing and sometimes more accurate than human judgment, are very far from perfect. Yet they’re being represented and marketed as if they’re scientific tools with mathematical precision, often by people who should know better.
This is an abuse of the public’s trust in science and in mathematics. Data scientists have an ethical duty to alert the public to the mistakes these algorithms inevitably make -- and the tragedies they can entail.
That’s the point I made in a recent conversation with Kosinski, who is also known for creating the “magic sauce” psycho-profiling algorithm that Cambridge Analytica later adapted to campaign for both Brexit and Donald Trump.
His response was that we’re both trying to warn the world about the potential dangers of big data, but with different methods. He’s showing the world the “toy versions” of algorithms that can be, and surely are being, built with bigger and better data elsewhere -- and he doesn’t derive any income from the commercial applications. Academic prototyping, if you will.