Coders Working To Solve Major Facial Recognition Software Problem

There's a problem with facial recognition software: it doesn't see race.

And not in the good way where everybody is the same and we’re all treated equally. As in it literally doesn't see races other than white.

Nobody knows this problem better than Henry Gan, a software engineer at Gfycat. The meme-obsessed company is currently testing artificial intelligence and facial recognition as a way to perpetuate those memes.

How, you ask? They're using facial recognition to read people's facial expressions and then match them against a database of GIFs. You then send that GIF to your friends in place of your own face.

Why not send a picture of your own face? That's just not how the internet works.
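To make the idea concrete, here is a minimal Python sketch of the pipeline the article describes: detect an expression from a photo, then look up a matching GIF. The detect_expression helper, the labels, and the URLs are hypothetical stand-ins, not Gfycat's actual code or API.

```python
# Hypothetical sketch of the expression-to-GIF pipeline described above.
# The model, labels, and URLs are illustrative placeholders.

GIF_DATABASE = {
    "smile": "https://example.com/happy-dance.gif",
    "frown": "https://example.com/sad-trombone.gif",
    "surprise": "https://example.com/shocked-cat.gif",
}

def detect_expression(image_bytes: bytes) -> str:
    # Placeholder: a real system would run a facial-expression model here.
    return "smile"

def gif_for_face(image_bytes: bytes) -> str:
    """Pick a GIF that matches the expression detected in the photo."""
    expression = detect_expression(image_bytes)
    return GIF_DATABASE.get(expression, "https://example.com/shrug.gif")

print(gif_for_face(b"raw camera frame"))
```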

Just ask Henry. He installed the software and started toying around with it around the office to see how well it recognized his coworkers' faces. It identified all of the white employees pretty well, but it had trouble telling the Asian employees in the office apart.

“It got some of our Asian employees mixed up,” Gan told Wired magazine. “Which was strange because it got everyone else correctly.”

This isn't just a problem for Gfycat. Since most facial recognition software is provided by giants like Microsoft and IBM, the entire industry is grappling with racial bias in facial recognition.

Most facial recognition software is powered by artificial intelligence, and that intelligence is only as smart as the information it’s given. In Gfycat’s case, the company used open source software provided by Microsoft to create its own facial recognition program. To build that underlying software, Microsoft used thousands of photographs provided by the University of Chicago and Oxford; however, those photos were predominantly of people with European ancestry, and very few were of African or Asian descent.
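That kind of skew is exactly what a simple audit of the training data would surface. Below is a rough Python sketch of such an audit; the group labels and counts are invented for illustration and are not the actual University of Chicago or Oxford datasets.

```python
# Hypothetical audit of a face-photo training set's demographic balance.
# The labels and counts are made up to illustrate the skew described above.

from collections import Counter

def audit_training_set(labels):
    """Return each group's share of the training photos."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

labels = ["european"] * 900 + ["african"] * 50 + ["asian"] * 50
print(audit_training_set(labels))
# {'european': 0.9, 'african': 0.05, 'asian': 0.05}
```

A model trained on a set like that sees far fewer examples of the under-represented groups, so it learns them less well.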

As a result, Gfycat’s software had trouble identifying Asian employees. “Because of how the algorithm worked I expected it to be universally good,” Gan said. “Clearly that was not the case.”

To fix it, Henry had to develop an “Asian-detector,” which made the system more sensitive, applying more stringent thresholds whenever it detected a face of Asian descent.
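In code, that kind of fix amounts to group-dependent thresholding: require a higher similarity score before accepting a match when the detected face belongs to the group the model confuses most often. The Python sketch below is an assumption about how such a rule might look; the threshold values and function names are illustrative, not Gfycat's implementation.

```python
# Illustrative group-dependent matching threshold, loosely modeled on the
# fix described above. Values and names are assumptions, not Gfycat's code.

DEFAULT_THRESHOLD = 0.60  # similarity needed to accept a match (assumed)
STRICT_THRESHOLD = 0.75   # higher bar where the model mixes people up (assumed)

def accept_match(similarity: float, detected_group: str) -> bool:
    """Accept an identity match only if the similarity score clears the
    threshold assigned to the detected demographic group."""
    threshold = STRICT_THRESHOLD if detected_group == "asian" else DEFAULT_THRESHOLD
    return similarity >= threshold

print(accept_match(0.70, "asian"))     # False: below the stricter bar
print(accept_match(0.70, "european"))  # True: clears the default bar
```

The stricter bar trades a few missed matches for far fewer people being confused with one another.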

“Saying it out loud sounds a bit like prejudice, but that was the only way to get it to not mark every Asian person as Jackie Chan or something,” Gan says.

Gfycat’s improved system is now 98% accurate for white faces and 93% accurate for Asian faces: a vast improvement, and the kind of investment other tech startups will have to make so that the inherent biases found in humans aren’t passed on to our digital counterparts.
