Facebook’s New Anti-Extremist Warning Isn’t Very Helpful

Facebook has evolved a lot over the past couple of years and, like many companies, has tried its best to protect its users from content that might upset or stress them. That's why, for certain content and issues, Facebook often shows users a warning before letting them view anything that might be emotionally distressing. However, its newest warning might not be as helpful as the company thinks.


With talk of extremist content on social media being on the rise, Facebook has introduced a new notice that alerts users if it suspects they have recently viewed "extremist content".

The warning reads: “You may have been exposed to harmful extremist content recently”, followed by “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”

If there's one thing Facebook is keen on, it's protecting its users from content they don't want to see and helping them find content they do. However, while the intentions behind this new warning are good, the execution hasn't exactly hit the nail on the head yet.


When you receive the warning, it does not actually tell you, or even hint at, what the offending content was. The best you can do is guess, which means going back and hunting for the extremist content yourself, and that kind of defeats the whole purpose of the warning, doesn't it?

The warning also includes a button linking to a page where users can "get support from experts" if they fear they have been exposed to extremist content. However, when people actually click the link, they often just get an error message that reads "This feature isn't available to everyone right now".

This suggests that the warning is still in its testing phase and the bugs haven't been properly worked out yet. That is a bit irresponsible on Facebook's part, because users who might actually need help are unable to get it due to this bug.

There has also been plenty of buzz about an alternate version of the warning circulating: instead of telling you that you've been exposed to extremist content, it asks, "Are you concerned that someone you know is becoming an extremist?"

This version of the warning has drawn plenty of laughter from users, mostly from imagining their sweet friends or family members as possible extremists. Users have joked that it feels like something out of George Orwell, and that if they haven't received the warning about a friend being an extremist, maybe that makes them the extremist friend.

Given the state of the world today, it is arguably responsible of Facebook to have a warning system for extremist content in place, but it would be wise to put that system through its paces before rolling it out across the site.