

As Facebook works to reinvent itself for a newer generation of users, there has been plenty of trial and error in the features its teams build to appeal to different kinds of people.
However, not every new feature is without its faults, and while the mistakes are rarely intentional, that doesn't mean the company is blameless or that these errors come without consequences. One recent accident proved especially damaging, and people are unsure how to move past it.
Facebook users watching a video posted by the Daily Mail on July 27th were appalled to see an auto-prompt, generated by Facebook's AI, appear after the video ended, asking whether they would like to "keep seeing videos about Primates."
You might be asking yourself, "What's wrong with an AI asking viewers if they want to watch more videos about primates?" Here's the problem: the Daily Mail video didn't feature primates of any kind. The people most prominently featured in the video were Black. See the problem now? (If not, it's worth researching the racist slurs that Black people have had to endure for well over a century.)
Nonetheless, the auto-prompt sparked a great deal of understandable outrage. It didn't take long for Facebook to catch wind of this blunder, and the company has already begun taking steps to undo as much of the moral damage as it can.
Facebook has issued a statement on the matter, explicitly saying that the auto-prompt in no way reflects the company's own view of the Black community.
The company has also begun taking steps to keep mistakes like this one from happening again: it has disabled the auto-prompt and removed the "Primates" topic entirely so that it cannot be recommended again.
Facebook, despite its reputation as a place where racially prejudiced users congregate around shared beliefs, has, like many other social media platforms, tried to make its app friendlier and more welcoming to a more socially conscious generation of users.
Of course, one point Facebook keeps emphasizing is that, while AI has come a very long way since its inception, it is still far from perfect. Technology has always been full of flaws, and AI systems have repeatedly shown racial bias, often because the data they are trained on reflects the biases of the world that produced it. The only thing we can hope for is that the technology improves over time so that a moral blunder like this never happens again.