
Oversight Board to examine Facebook posts about summer riots

03 Dec 2024 3 minute read
A firework is thrown towards police during an anti-immigration demonstration near the Holiday Inn Express in Rotherham, South Yorkshire. Photo: Danny Lawson/PA Wire

The Oversight Board, which examines content moderation decisions made by Meta’s social platforms, is to look at three cases linked to posts shared during the summer riots in the UK.

Violence erupted across the country after a knife attack in Southport killed three girls and injured eight others. The unrest was fuelled by misinformation spreading rapidly on social media about the attacker’s identity, including false claims that he was an asylum seeker who had arrived in the UK on a small boat.

There have since been calls to tighten online safety laws to better respond to misinformation and disinformation, because of the real-world impact they can have.

Incitement

The Oversight Board has now confirmed it will look at cases involving three posts from that time which were reported to Facebook for violating either its hate speech or violence and incitement policies.

The first post expressed agreement with the riots, called for mosques to be attacked and for buildings housing migrants to be set on fire.

The second piece of content was a reshare of another post. It showed what appeared to be an AI-generated image of a giant man in a Union flag T-shirt chasing several Muslim men, with overlay text giving the time and place of one of the protests.

The third post was another apparently AI-generated image, showing four Muslim men running after a crying blond-haired toddler in a Union flag T-shirt in front of the Houses of Parliament, captioned “wake up”.

Automated tools

All three posts were initially kept on Facebook after being assessed only by Meta’s automated tools, with no human review. The same users who had reported the posts then appealed the decisions to the Oversight Board.

The board said it had selected these cases to examine Meta’s policy preparedness and crisis response to violent riots targeting migrant and Muslim communities.

It said that as a result of selecting these cases, Meta has now determined that its previous decision to leave the first post on Facebook was an error and has removed it.

The social media giant confirmed to the board that it still believes its decisions to leave the second and third posts on Facebook were correct.

Public comments

The Oversight Board said it would now accept public comments on the issue, including the role social media played in the UK riots and the spreading of misinformation.

It is expected to issue decisions on the cases in the coming weeks, and can make policy recommendations to Meta which, although not binding, must be responded to by the tech giant within 60 days.


Support our Nation today

For the price of a cup of coffee a month you can help us create an independent, not-for-profit, national news service for the people of Wales, by the people of Wales.


2 Comments
Erisian
8 days ago

Moderators cost money. Dangerous inaction is profitable.
The child Zuckerberg made his choices long ago.

Jeff
8 days ago

Oversight board? Self-inspection? None of the platforms have the moderation they would require for their own families. Musk is off amplifying racists, hate and fascists, and basically off with the fairies; the Meta chief is kissing Donny’s ring cos he wants to keep his billions. What will this achieve?

Governments must hold the platform owners to task. Fine them a few bucketloads of cash and they will find the moderation.

