

A University of Portsmouth researcher has contributed vital evidence to a UK Government report underscoring the need to address the spread of online misinformation monetised by advertising
11 July 2025
5 minutes
- New report outlines flaws in the Online Safety Act (OSA), noting it fails to address misinformation spread by algorithms that favour ad views and user engagement.
- Social media and digital advertising business models prioritise profit over the safety of the UK public, with digital advertising acting as the financial engine.
- The report calls on the UK Government to take further action to better govern digital advertising.
As a result of a recent social media inquiry, a government report published today (11 July) highlights growing concerns about online safety and how digital platforms indirectly profit from social unrest through advertising revenue.
The report highlights critical evidence on the risks associated with social media, provided by several experts including Dr Karen Middleton, a Senior Lecturer in the School of Strategy, Marketing and Innovation at the University of Portsmouth.
Published by the committee, the report emphasises the need for new standards, addressing gaps in the regulation of digital advertising as well as in legislation on generative AI.
Expanding on an evidence session held earlier this year, at which the committee heard from witnesses including Dr Middleton about rising concerns over UK online safety laws, the report highlights concerns over Ofcom's role in regulating areas such as misinformation and AI. The government inquiry also heard from social media platforms and those affected by the Southport riots, highlighting the need for further action to address the spread of online misinformation monetised by advertising.
The Online Safety Act was primarily designed, rightly, to protect children from online harm. However, in the report, MPs explore flaws in Ofcom's current regulation of digital media, which overlooks online misinformation. The report explains that digital media algorithms favour ad views and engagement over authenticity and safety. This raises concerns about the safety of the UK public, and the report urges the government to regulate social media companies further to tackle the spread of false content online.
Digital advertising has also been identified as an issue, with the business models of social media companies encouraging the algorithmic spread of engaging content. Dr Middleton found that the algorithmic spread of false and harmful content has strong ties to the digital advertising market.
Dr Middleton, from the Faculty of Business and Law at the University of Portsmouth, said: "Digital advertising is the financial engine behind much of the content we see online, including harmful misinformation. The current system is complex, opaque, and profit-driven, and has allowed disinformation networks and even criminal enterprises to thrive unchecked.
"Without transparency and accountability, advertisers can unknowingly fund content that undermines public trust, polarises society, and threatens democracy.
"I was very pleased to be able to contribute to the Select Committee's important work on this issue. It is clear that we need a more robust, joined-up approach that strengthens the Advertising Standards Authority's remit while aligning the efforts of key government bodies like Ofcom and the Information Commissioner's Office."
The report was informed by Dr Middleton's research on how online media algorithms are designed to spread content that is often sensationalist and emotional, and can be harmful to individuals and to society more broadly. This is not limited to one platform; it spans the internet, because the entire digital advertising system is designed to promote content that performs well rather than content that is safe and ethical.
Platforms such as TikTok and Meta have disputed this, arguing that harmful content does not align with their business interests.
The Chair of the committee, Dame Chi Onwurah MP, said: "Social media can undoubtedly be a force for good, but it has a dark side. The viral amplification of false and harmful content can cause very real harm, helping to drive the riots we saw last summer. These technologies must be regulated in a way that empowers and protects users, whilst also respecting free speech.
"Today's report sets out a way forward for the government to ensure that people in the UK can stay safe online and control what they see, by disincentivising the viral spread of misinformation, regulating generative AI, and placing much-needed new standards onto social media companies.
"A national conversation is already underway on this vital issue; we look forward to the government's response to our report and will continue to examine the consequences of unchecked online harms, particularly for young people, in the months to come."
The report calls for better regulation through a new online safety regime focused on public safety, free and safe expression, responsibility for content, control over content and data, and technological transparency.
Implementing these measures would help to protect the UK public in online spaces and, it is hoped, reduce misinformation.
"To protect the public interest and restore integrity to our digital spaces, it is vital that the government steps in to mandate clearer oversight of ad placement, strengthen regulation of algorithmic targeting, and hold platforms accountable for the content they monetise," added Dr Middleton.
Header image credit: Parliament UK