Facebook Apologizes After Its AI Labeled Black Men ‘Primates’…. WTF?
As you know, I am not a fan of Facebook. And this incident has made me less of a fan. According to the BBC, Facebook users who watched a newspaper video featuring black men were asked if they wanted to “keep seeing videos about primates” by an artificial-intelligence recommendation system:
Facebook told BBC News it “was clearly an unacceptable error”, disabled the system and launched an investigation. “We apologise to anyone who may have seen these offensive recommendations.” It is the latest in a long-running series of errors that have raised concerns over racial bias in AI.
Their methodology for training the AI was probably flawed. Somewhere along the line, the training set this AI used likely contained images of black men mixed in with whatever was labeled “primates”. The AI correctly identified that, yes, humans are primates, but nothing in the training data taught it why “primate” was the wrong label to apply. Which means Facebook really needs to go back to the drawing board and come up with an AI that doesn’t show racial bias. Having a more diverse workforce would help with this, seeing as Facebook has a problem in that department. But, you know, baby steps.
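To see why bad labels in a training set matter so much, here’s a toy sketch in Python. This is hypothetical illustrative data, not Facebook’s actual system or anything like it; the point is just “garbage in, garbage out”: if mislabeled examples sit in the training data, the model faithfully repeats the mistake.

```python
# Toy sketch (hypothetical data, NOT Facebook's system): a classifier is only
# as good as its labels. If images of people end up in the "primate" bucket
# of the training set, the model happily learns and repeats that mistake.
from collections import Counter

# Hypothetical training set: (feature, label) pairs. Feature 2 stands in for
# images of people that were wrongly filed under "primate" during labeling.
training_data = [
    (1, "cat"), (1, "cat"),
    (2, "primate"), (2, "primate"),  # mislabeled: these are actually people
    (3, "dog"),
]

def predict(feature):
    """Majority vote over training examples sharing the same feature value."""
    labels = [label for f, label in training_data if f == feature]
    return Counter(labels).most_common(1)[0][0]

# The model never had a chance to learn the right answer for feature 2:
print(predict(2))  # prints "primate" -- garbage in, garbage out
```

No amount of cleverness in the model fixes this; the only cure is auditing and correcting the training data itself, which is exactly the drawing board Facebook needs to go back to.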
This entry was posted on September 7, 2021 at 8:47 am and is filed under Commentary with tags Facebook. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site.