Corey Deshon, a professional photographer who uses Flickr, photographed at his home holding the Canon AE-1 film camera he used to capture a portrait of a black man that Flickr's software incorrectly tagged as "ape."
All Jacky Alciné wanted from Google Photos was a space to upload snapshots from his life. But a few weeks ago, what he got from the photo-sharing service was racist insults and a smattering of stories in the national press after photos of him and a female friend were labeled “gorillas” instead of humans.
“That has to be wrong,” thought Alciné, who is black, as he shuffled through his library of photos, only to discover that dozens of other pictures were also labeled “gorilla.” “It’s inexcusable,” said the 22-year-old Brooklyn Web developer. “There’s no reason why this should happen.”
Photo-sharing services Google Photos and Flickr have come under fire recently for software that tags photos of black people as gorillas or apes, dredging up racist attitudes from the colonial era. But in this case, it isn’t humans seeing with racist eyes; it’s their software.
“At high levels, what’s really going on is it’s just a kid that’s been raised in a particular neighborhood and doesn’t understand things from outside of its world that well,” said Vivienne Ming, a machine learning expert and co-founder of education technology firm Socos.
The problem is likely twofold, experts say. Not enough photos of African Americans were fed into the program for it to reliably recognize a black person. And there probably weren’t enough black people involved in testing the program to flag the issue before it was released.
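The effect experts describe can be illustrated with a toy experiment. The sketch below is hypothetical and has nothing to do with Google's or Flickr's actual systems; it trains a simple classifier on synthetic "image features" in which one group is heavily overrepresented, and shows how errors then concentrate on the group the model rarely saw.

```python
# Hypothetical illustration of training-data imbalance, not any real photo service's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic feature vectors: group A is heavily overrepresented in the training data.
group_a = rng.normal(loc=0.0, scale=1.0, size=(5000, 20))  # 5,000 examples
group_b = rng.normal(loc=0.6, scale=1.0, size=(100, 20))   # only 100 examples

X = np.vstack([group_a, group_b])
y = np.array([0] * len(group_a) + [1] * len(group_b))      # label 1 = underrepresented group

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Overall accuracy looks fine, but mistakes pile up on the group the model barely trained on.
for label, name in [(0, "overrepresented group"), (1, "underrepresented group")]:
    mask = y_test == label
    acc = model.score(X_test[mask], y_test[mask])
    print(f"{name}: accuracy {acc:.2%} on {mask.sum()} test examples")
```

Run as-is, the classifier scores near perfectly on the large group while misclassifying a sizable share of the small one, a rough analogue of software that has seen too few examples of a group to label its members correctly.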
“We’re appalled and genuinely sorry that this happened,” Google said in a statement about the incident with Alciné. “There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
Black staffers rare
Machine-learning experts say the issue — along with a surprisingly similar incident at Flickr — highlights a larger problem with diversity in Silicon Valley, where black people are dramatically underrepresented at big companies. Among Google’s U.S. employees, just 1 percent of the people who hold technical jobs are black. To be sure, the problem with Google Photos might have happened no matter who designed it; but theoretically, if more black staffers were plugging their own pictures into the service, someone would have caught the mistake.
“If you have a diverse workforce, then you have a much better chance of picking up on things that a lack of diversity would hide from them,” Ming said.
Both Flickr and Google declined to answer questions about the racial makeup of the engineering teams behind their photo services. But the numbers in the companies’ broader diversity reports provide some clue.
This year, Yahoo (which owns Flickr) said just 1 percent of its U.S. employees with technical jobs are black. In a category called “professionals,” including software engineers, black people represented 2 percent of the jobs, or 85 people, out of 4,073 employees, according to a 2013 report filed with the U.S. Equal Employment Opportunity Commission. Google has similar representation, with nearly 2 percent, or 369 black people, out of 22,130 employees in its “professionals” category, according to a 2014 report to the EEOC.