Ofcom needs “emergency powers”

A think tank is urging stronger action to tackle misinformation on social media after the riots that followed the fatal stabbings of three schoolgirls in Southport
Ofcom should have emergency powers to fight misinformation and deamplify harmful posts to prevent public disorder, a leading think tank has recommended after a discussion with government officials and the Metropolitan Police’s Counterterrorism Unit.
Under proposals outlined by the Centre for Countering Digital Hate (CCDH) and seen by The Times, the regulator would be able to apply to a judge to gain emergency powers in a moment of crisis that would temporarily allow it to demand immediate action by social media platforms.
The CCDH said that this could also be achieved by amending the “special circumstances” directive in the Online Safety Act 2023. The provision allows the science, innovation and technology minister to give a direction to Ofcom if they have reasonable grounds for believing that there is a threat to national security or to the health and safety of the public.
X, owned by Elon Musk, has been accused of playing a significant role in the riots
GONZALO FUENTES/REUTERS
The proposed emergency powers could put platforms and their proprietors, such as Elon Musk, the owner of X, on a collision course with the regulator.
The powers are one of five recommendations to amend the Online Safety Act to battle misinformation on social media following the outbreak of riots across the country after the fatal stabbings of three schoolgirls in Southport.


Other recommendations include reducing misinformation risk assessment and reporting requirements, increasing capacity for researchers to identify emerging harms, addressing advertising and commercial incentives behind disinformation, and closing loopholes for smaller high-risk platforms such as Telegram.
The CCDH proposed the amendments after an emergency meeting on August 16 to discuss the role of social media in the riots and conduct a rapid “lessons learnt” exercise.
False claims that the suspect in the Southport attack was a Muslim and small-boat migrant are widely believed to have contributed to the social disorder that spread, leading to attacks on mosques and hotels housing asylum seekers.
The meeting was attended by top officials from the Home Office, the Department for Science, Innovation and Technology, Ofcom and the counterterrorism internet referral unit at the Metropolitan Police.


The riots after the Southport stabbings were fuelled by misinformation spread online
Representatives from the Community Security Trust (CST), which campaigns against antisemitism; the Incorporated Society of British Advertisers; Tell Mama, which documents incidents of anti-Muslim hate; the Online Safety Act Network and current and former Labour and Tory politicians were also at the meeting.
The organisations that attended the meeting have not explicitly endorsed the recommendations.
The Online Safety Act is scheduled to come into force next year. It has been criticised by campaigners for not going far enough and for failing to place an obligation on tech giants to remove content that is “legal but harmful”.
According to analysis by the CCDH, false claims about the identity of the Southport suspect were viewed on X at least 420,000 times and mentioned in 2,632 posts across five platforms on the day of the stabbings alone.
In one post, Musk wrote that “civil war is inevitable” in the UK
GONZALO FUENTES/REUTERS
Imran Ahmed, the chief executive of the CCDH, said that X stood out among social media platforms in how it was used by bad-faith actors to spread disinformation and sow chaos in the aftermath of the Southport stabbing.


“The owner of X, Elon Musk, personally shared false information about the situation to his 195 million followers and made a show of attacking the UK government’s response to the outbreak of violence,” Ahmed said. “Rather than ensuring risk and illegal content were mitigated on his platform, Musk recklessly promoted the notion of an impending ‘civil war’ in the UK.
“Musk has transformed Twitter, once the go-to source for journalists, politicians and the public for real time news, into X, a platform with imperceptible moderation and the morality of Telegram,” he said.
“We need all of us in British society — politicians, regulators, big companies advertising on social media, the press and the public — to decide if we want to continue this toxic relationship with an increasingly abusive boyfriend, X, or break up for good”.
In response to the CCDH’s proposals, a government spokesman said that the government’s immediate priority was “swiftly and effectively” implementing the act to allow Ofcom to combat the spread of illegal content. Any changes would have to be weighed against the delays they would cause to the act coming into force, given Ofcom’s need to consult the public, the spokesman added.


A spokesman for the regulator said that “any changes to the legislation would be a matter for government and parliament”.
The CST said that it was time for platforms such as X to be held to account for the spread of misinformation, which it said had resulted in an “unprecedented increase in the levels of antisemitic, hateful and extremist content” on X.
Iman Atta, the director of Tell Mama, welcomed the proposal and called for the government to address the “misinformation, intimidation and violence” fuelled by social media platforms.
The Metropolitan Police and X did not respond to a request for comment.
