Who really decides what we see online? New research exposes the hidden power social media platforms hold in shaping political debate and democratic life.
What happens when a handful of tech giants decide what political content we see online, or what we don't see?
That’s the question driving Dr Andreu Casas, a political scientist at Royal Holloway. His research explores how social media platforms moderate and create political content and what this means for public debate and policy decisions.
Social media has long been known for influencing political change. But as platforms tighten control over content, the stakes have never been higher. Dr Casas’ work shines a light on these hidden dynamics, asking urgent questions about transparency, accountability and the future of democratic speech.
“Social media is where younger generations get their political news,” Dr Casas explains. “They might get some from mainstream newspapers or the BBC, but they’re seeing those stories on Instagram or TikTok. That’s the platform shaping their worldview.” Yet, behind the scenes, decisions about what content is shown and what is suppressed are made by just a handful of private companies - Meta, Google, TikTok - each with global reach and enormous influence.
“Who decides what voices are amplified or silenced?” Dr Casas asks. “Right now, it’s a handful of companies. And that should concern all of us. There is very little research in this area and that’s what I’m trying to address.”
Geopolitics and the TikTok controversies
In recent years, concerns have emerged about politically motivated censorship by Western social media companies, and Dr Casas set out to examine them. He recently tracked 600,000 users who follow Iranian elites on Twitter/X to see how they were moderated. The results showed that conservative users and those supportive of the Iranian government were significantly more likely to be suspended, highlighting how global politics can influence social media moderation.
Dr Casas’ latest project investigates TikTok and comes amid global debates about banning the app in Western countries, particularly the USA, where policymakers fear that TikTok amplifies pro-China content. There is also concern in the US that TikTok will start censoring voices critical of Trump, especially since 22nd January, when TikTok’s US operations were taken over by a new US-based company with close ties to the administration.
On 25th January, many TikTok users complained about being unable to post videos mentioning ICE, Alex Pretti or Jeffrey Epstein. However, preliminary findings from Dr Casas, in collaboration with other international scholars, showed that these irregularities were more likely the result of power outages than targeted censorship on the part of TikTok. “This doesn’t mean that the way TikTok moderates content in the US won’t change in future, but we’ll remain vigilant,” Dr Casas says.
The London Social Media Observatory

In the UK, Dr Casas has launched the London Social Media Observatory - a hub for studying how platforms moderate and create political content, and what this means for democratic processes.
The observatory, which launched in December 2025, is backed by a £2.5 million Future Leaders Fellowship and partners with several stakeholders, such as the UK Electoral Commission and the Westminster Foundation for Democracy (WfD). It will analyse moderation during the UK elections and assess compliance with the Online Safety Act. It will also publish quarterly policy and data reports for the public, offering accessible insights into trends like where banned influencers migrate after suspension.
The first policy brief, on TikTok, Politics and Elections, is already available, co-produced with the WfD. It reflects the concerns discussed by a group of policymakers, academics, civil society representatives, content creators and strategic advisors at an event co-organised with the WfD, held at Royal Holloway's Stewart House in London on 8th December 2025.
How it all began
Studying how platforms moderate political content is incredibly challenging because it requires vast amounts of social media data collected in real time, along with advanced computational tools to analyse millions of videos, images and text posts. “Platforms make it harder and harder to collect data,” Dr Casas explains. “You need the right infrastructure, machine learning models, and a mix of political science, advanced computational linguistics and computer science skills. That’s why only a handful of researchers worldwide are working on this.”
Dr Casas’ own journey began when he was about to finish his undergraduate degree in 2011. This was the time of the Arab Spring protests and the anti-austerity movements in Spain and the USA. These movements - protesting economic inequality, political corruption, and lack of opportunity - used social media to spread ideas, organise protests, and build global solidarity.
“I realised early on that social media was going to be really important for politics moving forward, and if I wanted to study it, I needed to learn new skills,” says Dr Casas. “To study it properly, I had to learn computational methods and tailor my PhD programme to combine political science with data science.”
Tracking suspensions and bias
Recently he tracked thousands of YouTube channels posting about politics during the 2024 US presidential election. The findings were striking: conservative channels were suspended at higher rates than liberal ones. At first glance, this seemed to confirm claims of bias from conservative groups. But Dr Casas dug deeper.
“When we controlled for misinformation and hate speech, ideology stopped being a relevant predictor,” he says. “The reality is that extreme conservative channels were producing most of the hateful language and misinformation. That’s why they were removed. This distinction is important because it shows the issue isn’t just about censorship. It highlights complex trade-offs platforms face between safeguarding free expression and mitigating harmful content.”
“This isn’t just about academic papers,” Dr Casas emphasises. “It’s about informing policymakers, engaging the public, and creating a place where future researchers can grow.”
_________________________________________________________________
Research Paper:
Policy Brief:
News Coverage:
Good Authority: Was there censorship on TikTok after the U.S. takeover?
NPR: Researchers say no evidence of TikTok censorship, but they remain wary
Return to our Research in Focus page to uncover more exciting research happening at Royal Holloway, University of London.