What Are You Looking At?

by Stephen Wunderli — 6 months ago in Technology 2 min. read

Facial recognition and other biometrics are all the rage. Easy, useful, secure. And easy to hack. Many phones can be opened with just a photo of the owner’s face. And once your face is in a database, who knows where it goes? Or, more importantly, what “you” start to look like.

Facial recognition is not foolproof. Relying completely on AI to identify a person is risky (see Tom Cruise in Minority Report). There are documented cases of facial recognition being fooled with a photograph. There are data mix-ups on the back end, since humans are inputting the data. And there is always the possibility of a hack, which could give you two online identities without you even knowing.

Oscar Wilde’s disturbing classic, The Picture of Dorian Gray, gives a young aristocrat an alter ego, much the way a stolen digital identity can be used to create an entirely new person who lives a hedonistic life on your dime. Dorian indulges himself in every sensual way imaginable without the usual wear and tear. He remains youthful and attractive, but his portrait bears the scars of his debauchery. With no consequences to shoulder, Dorian Gray is free from any morals and spends his days deceiving and exploiting for his own pleasure. If you’ve ever had your identity stolen, you know that the thief didn’t just buy food essentials, but every indulgent thing he could get away with. You can just see him shouting, “Party’s on me!” while we, the victims, sit alone in the drawing room like the portrait, gathering dust and aging horribly from the stress, since there is really nothing we can do.

A portrait paints a thousand words. And so does facial recognition. 

Protecting security and privacy means understanding how to prevent misuse of facial recognition software. First: know that facial recognition software isn’t foolproof. In fact, it has a built-in bias: it is more accurate when identifying white men and less accurate when identifying Black women. This is very concerning when it comes to identifying criminals based on public records, and innocent people have already been wrongly identified. There is now even a software program that can alter the digital points of your face to make it “not you.” Its creator says: “Think of it as an Instagram filter.”

The ubiquitous use of facial recognition makes these flaws worrisome. Banks. Security companies. Facebook. The technology is spreading faster than fake news. 

Maybe it’s time to look closely at who we are and what we put our faith in. Technology, particularly machine learning in this case, is never foolproof. The best privacy and security tools use a combination of biometrics, machine learning, hardware, third-party verification, and human interaction—actually relying on the user’s brain to remember or change a sequence of words or numbers in order to validate identity. There might even come a time when a DNA signature is required. 
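The idea of pairing a biometric with something only the user's brain holds can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the face-match score, threshold, and stored passphrase are all hypothetical, and a real system would use a salted key-derivation function rather than a bare hash.

```python
import hashlib
import hmac

# Hypothetical values for illustration only.
FACE_MATCH_THRESHOLD = 0.90  # the biometric alone is never treated as sufficient
STORED_PASSPHRASE_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()

def verify_identity(face_match_score: float, passphrase: str) -> bool:
    """Require BOTH a strong biometric match and knowledge of a passphrase."""
    biometric_ok = face_match_score >= FACE_MATCH_THRESHOLD
    entered_hash = hashlib.sha256(passphrase.encode()).hexdigest()
    # compare_digest avoids leaking information through timing differences
    knowledge_ok = hmac.compare_digest(entered_hash, STORED_PASSPHRASE_HASH)
    return biometric_ok and knowledge_ok

# A photo-based spoof might score high with the camera but fail the brain test:
print(verify_identity(0.97, "wrong guess"))                   # False
print(verify_identity(0.97, "correct horse battery staple"))  # True
```

The point of the AND in the last line is exactly the article's argument: defeating one factor, such as holding a photo up to the camera, still leaves the attacker locked out.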

Until then, beware of what you look at, and what you trust. You don’t want any wretched portraits of yourself hanging around the Internet. 😉
