Far-right group Britain First banned by Facebook

Far-right group Britain First has been banned by Facebook.

The extremist far-right group, which campaigns against racial diversity, boasted more than a million followers, making its page the largest political Facebook page in the UK.

The page has also previously lashed out at LGBT equality, attacking a reality dating show for featuring a lesbian couple last year and repeatedly criticising transgender rights.

The movement has been linked to ‘raids’ on mosques and extremist rallies across the UK, with its leader Paul Golding and deputy Jayda Fransen both jailed earlier this month for religiously aggravated harassment.

The page – which has previously been temporarily suspended on a number of occasions – was permanently deleted today, Facebook confirmed.

Protesters hold placards and British Union Jack flags during a protest (Photo by Chris J Ratcliffe/Getty Images)

Facebook said in a statement: “People come to Facebook to express themselves freely and share openly with friends and family; sometimes this can include their political views.

“Some political opinions might be controversial, but it is important that different views can be shared and we are very careful not to remove posts or Pages just because some people don’t like them.

“We are an open platform for all ideas and political speech goes to the heart of free expression. But political views can and should be expressed without hate. People can express robust and controversial opinions without needing to denigrate others on the basis of who they are.

“There are times though when legitimate political speech crosses the line and becomes hate speech designed to stir up hatred against groups in our society. This is an important issue which we take very seriously and we have written about how we define hate speech and take action against it in our Hard Questions series.

“We have Community Standards that clearly state this sort of speech is not acceptable on Facebook and, when we become aware of it, we remove it as quickly as we can. Political parties, like individuals and all other organisations on Facebook, must abide by these standards and where a Page or person repeatedly breaks our Community Standards we remove them.”

A protester gestures towards members of ‘Unite Against Fascism’ (Photo by Chris J Ratcliffe/Getty Images)

It added: “Content posted on the Britain First Facebook Page and the Pages of party leaders Paul Golding and Jayda Fransen has repeatedly broken our Community Standards.

“We recently gave the administrators of the Pages a written final warning, and they have continued to post content that violates our Community Standards. As a result, in accordance with our policies, we have now removed the official Britain First Facebook Page and the Pages of the two leaders with immediate effect.

“We do not do this lightly, but they have repeatedly posted content designed to incite animosity and hatred against minority groups, which disqualifies the Pages from our service.”

Britain First leader Paul Golding and deputy leader Jayda Fransen (Photo by Charles McQuillan/Getty Images)

Facebook reaffirmed its ban on homophobic and transphobic hate speech under a new set of community guidelines.

The guidelines state: “Facebook removes hate speech, which includes content that directly attacks people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, or gender identity, or serious disabilities or diseases.

“Organizations and people dedicated to promoting hatred against these protected groups are not allowed a presence on Facebook. As with all of our standards, we rely on our community to report this content to us.”

Addressing pages such as PinkNews, which regularly draws attention to hate groups, Facebook clarified: “People can use Facebook to challenge ideas, institutions, and practices. Such discussion can promote debate and greater understanding.

“Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others about that hate speech.

“When this is the case, we expect people to clearly indicate their purpose, which helps us better understand why they shared that content.”

Monika Bickert, the company’s Head of Global Policy Management, said: “It’s a challenge to maintain one set of standards that meets the needs of a diverse global community.

“For one thing, people from different backgrounds may have different ideas about what’s appropriate to share — a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standards.

“This is particularly challenging for issues such as hate speech.

“Hate speech has always been banned on Facebook, and in our new Community Standards, we explain our efforts to keep our community free from this kind of abusive language.

“We understand that many countries have concerns about hate speech in their communities, so we regularly talk to governments, community members, academics and other experts from around the globe to ensure that we are in the best position possible to recognize and remove such speech from our community.

“We know that our policies won’t perfectly address every piece of content, especially where we have limited context, but we evaluate reported content seriously and do our best to get it right.”