Facebook has apologized after its AI slapped an egregious label on a video of Black men. According to The New York Times, users who recently watched a video posted by Daily Mail featuring Black men saw a prompt asking them if they’d like to “[k]eep seeing videos about Primates.” The social network apologized for the “unacceptable error” in a statement sent to the publication. It also disabled the recommendation feature responsible for the message while it investigates the cause, in an effort to prevent serious errors like this from happening again.
Company spokeswoman Dani Lever said in a statement: “As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Gender and racial bias in artificial intelligence is hardly a problem unique to the social network — facial recognition technologies are still far from perfect and tend to misidentify people of color and women in particular. Last year, false facial recognition matches led to the wrongful arrests of two Black men in Detroit. In 2015, Google Photos tagged photos of Black people as “gorillas,” and Wired found a few years later that the tech giant’s solution was to censor the word “gorilla” from searches and image tags.
A few months ago, the social network shared a dataset it created with the AI community in an effort to combat the issue. It contained over 40,000 videos featuring 3,000 paid actors who shared their age and gender with the company. Facebook even hired professionals to light their shoots and to label their skin tones, so that AI systems could learn what people of different ethnicities look like under various lighting conditions. The dataset clearly wasn’t enough to completely solve AI bias for Facebook, further demonstrating that the AI community still has a lot of work ahead of it.