By Natalia Castro
Facebook has become the gateway to information in the social media world, but with this popularity comes heavy responsibility. As Facebook attracts a global network of billions of users, speech advocating violence has become prevalent across the site, raising questions about the role Facebook plays in moderation.
Ultimately, Facebook is allowed to have an editorial bias under the First Amendment, just like any media outlet. The problem arises when the "openness" model of Facebook promotes the growth of violent ideology. Despite clear community standards against violence — Facebook says it "remove[s] content that expresses support for groups that are involved in the violent or criminal behavior" — these rules can be difficult to enforce, leaving users with the responsibility of reporting abuse.
As Joe Newby and Adina Kutnicki explain in Banned: How Facebook Enables Militant Islamic Jihad, "In June 2014, Islam Exposed (The Truth about Militant Islam), was unpublished after Facebook claimed the page included pornographic content… Page administrators sent a letter to Facebook that contained screen shots of the page to prove that no such content existed… Facebook yanked the page but restored it, claiming that it was the result of a mistake. At that time, page administrators received death threats from critics but those threats were dismissed by Facebook. Instead of dealing with the threats leveled at administrators, Facebook chose to tear the page down, citing 'harassment.' In short, Facebook punished the victims for being threatened."
This occurs consistently with groups across a range of ideologies. Newby and Kutnicki go on to explain that Britain First, a right-of-center political party with 1.1 million supporters, had its page pulled with no explanation. Facebook restored the page only after the group established "an immediate legal fund to drag Facebook through court," claiming Facebook had violated its supporters' freedom of speech and expression.
Facebook also came under fire after a 2016 Gizmodo report uncovered that Facebook algorithms and curators often suppress conservative sources in favor of "neutral outlets" like the New York Times and CNN. Facebook pledged to solve the problem and even donated $62,500 in cash to the Conservative Political Action Conference (CPAC).
While certain political viewpoints are running into problems — likely due to abuse of the reporting feature — others that appear to be violating Facebook’s own community standards against violent ideology are still allowed on Facebook.
Newby and Kutnicki provided the example of a Facebook page promoting the beheading of infidels, stating, “When we Muslims take over America, you infidels will show respect or die.”
Following public outcry, the page was eventually removed from Facebook. Yet due to the openness of Facebook, groups that promote violence can continue to take advantage of the site as a tool for recruitment to their ideology and, ultimately, their cause. Some facets of Facebook can even help maintain their anonymity and protect these groups from exposure.
Take Antifa. These groups are hiding in plain sight. Facebook has three types of groups: public, in which anyone can see the group, its members, and their posts; closed, in which anyone can find the group and see who is in it, but only members can view posts; and secret, in which only members can find the group, see who is in it, and see posts.
Despite Facebook’s model for open and interactive communication, these closed and secret groups can become a hotbed for fostering radical violent ideology without suppression or oversight.
One Antifa group has a closed group with 4,824 members. The description explains that the group connects the international so-called “anti-fascist” community for easy communication and information sharing.
Pages such as Berkeley Antifa even write on their page, “While we truly appreciate all the love shown to us through our Facebook page, we ask that our supporters refrain from liking or commenting on any of our posts… We hide all comments made to this page in order to minimize risk for all involved. If your comment is hidden, it will still be visible to you and your friends. We encourage you to continue to share our posts and pass them around to your friends… If your Facebook account is truly privatized, liking posts made to leftist/antifascist pages is not nearly as much of a risk.”
A number of other closed Antifa groups can be found rather easily on Facebook. Why they are still there after all the violence is a good question.

Considering that 13 members of the group were just arrested in Berkeley, Calif., on charges ranging from assault with a deadly weapon to obstructing a police officer as part of an organized violent protest against peaceful conservative demonstrators, it is clear these closed groups are fueling violent principles under the radar.
The American people are at a crossroads on this issue as well. A poll by the Center for Security Policy/Eagle Forum/McLaughlin & Associates shows 85 percent of Americans support free speech, while 63 percent oppose Antifa for its violent acts against free speech. They do not agree with shutting down unpopular views, even racist ones, with violence or censorship.
This could put companies like Facebook in a delicate position. As Rick Manning, president of Americans for Limited Government, explains, "The fact that the First Amendment maintains broad support across America clearly shows that freedom runs through the very DNA of the American people, and those who seek to undermine basic freedoms or provide aid and comfort in that effort, do so at their own political peril. Private companies like Google, Facebook, Twitter and Paypal need to proceed cautiously as they consider becoming arbiters of appropriate speech on the Internet, because if Americans believe that they have relinquished their neutrality, they will lose their dominance to market alternatives."
This raises a compelling question of the information age: where do Facebook's responsibilities begin and end? Do closed groups being used by violent actors like Antifa violate Facebook's guidelines? It is hard to see how they don't.
As Facebook, a publicly traded company with a market cap of over $490 billion, continues to grow, these questions must be addressed.
In order for Facebook to remain credible, it must continue proving that it is committed to being an unbiased and secure network where the marketplace of ideas can flourish. At the same time, how it handles violent speech from radical groups remains an open question. How about addressing Antifa? That would be a good start.
Natalia Castro is a contributing editor at Americans for Limited Government.