The Structural Disintegration of the Public Sphere 

March 11, 2025
By Peter Pomerantsev

The issue at hand extends beyond just “disinformation.” Russian state media and other bad actors exploit a much deeper crisis—one affecting the relationship between information and democracy. We are experiencing a fundamental breakdown in the public sphere. How should society engage in discussions, make decisions, and set political priorities? The ancient Athenians had the Agora, a space where (elite, male, slave-owning) citizens debated important issues. In the early 20th century, Lord Reith envisioned the BBC as a national forum where people could unite around democratic values and reasoned debate.

It’s worth revisiting what we mean by the vague concept of a democratic “public sphere.” The philosopher Jürgen Habermas idealized 18th-century Britain as the birthplace of modern public discourse. He argued that as royal decision-making receded, a new bourgeoisie emerged, gathering in coffeehouses to discuss politics and commerce, forming civic groups, reading pamphlets, and influencing political parties. However, this was an exclusive, elite vision—more a segment of society than a truly public sphere. Later, newspaper moguls further distorted public discourse, pushing their own agendas. Reith’s BBC aimed to counterbalance this by offering direct, impartial communication between leaders and citizens. The resulting media landscape became a mix of partisan newspapers, each representing different identities, with journalists convening on radio and television as voices of varied perspectives.

This system, though flawed, had its benefits—it allowed millions of people to engage in national debates without direct interaction. In modern societies, we rely on institutions, media figures, and political parties to represent our identities and interests. However, this system also excluded many voices. The internet initially seemed to offer a solution, allowing individuals to participate directly, much like 18th-century pamphleteers. Early blogs encouraged open discussion, and the digital space seemed poised to become a global Agora. 

Then, social media changed everything. Control of the so-called “feed” fell into the hands of a few powerful individuals. Instead of directly engaging with ideas, users became subject to opaque algorithms that dictated what they saw, when, and why—often without their knowledge. These algorithms analyzed personal preferences, fears, and desires, shaping and manipulating user experiences. People became part of online crowds without understanding why they encountered certain content or whether it was promoted by genuine users or troll farms. The architects of these algorithms now shape not just news headlines but also perceptions of time, place, and personal aspirations. 

The concept of the public sphere was always somewhat mythologized, but even as an ideal, it now seems increasingly distant. In the United States, conspiracy-driven online mobs are on the rise, and election officials who upheld the legitimacy of the 2020 election have faced relentless harassment. Traditional print media, once a pillar of balanced discourse, is in decline. The very fabric of democratic processes, designed for an earlier era of information flow, is being tested. Authoritarian regimes, like those in Russia and China, point to this turmoil as proof that democracy is too chaotic to function in the digital age. They argue that centralized data control and top-down governance are better suited for today’s world, where technology can be used to design optimal cities and policies. At the 100th anniversary of the Chinese Communist Party, the number “5,” symbolizing 5G technology, loomed over the celebrations—suggesting a future of governance driven by total data centralization.

So, what can be done? 

Regulation is a starting point. Free speech includes not only the right to express oneself but also the right to receive information transparently. Currently, social media companies operate behind closed doors. Radical transparency is needed to reveal how they function. Are they suppressing certain viewpoints? Are they enabling coordinated efforts to undermine elections? Are they taking action to prevent or incite violence? Are they protecting children from online exploitation? Independent researchers need access to platform data to answer these questions. The EU has implemented laws demanding such transparency, while the United States has not—ironically, making America’s information environment less free. Without transparency, citizens are left unaware of how they are being influenced by tech giants. 

Regulating individual pieces of speech is ineffective, and often unethical, unless the speech in question violates existing laws. However, users should have the right to understand why they see certain content, how their data is being used, and whether their children are safe online. They should also know if platforms are facilitating the manipulation of elections. Until figures like Elon Musk and Mark Zuckerberg reveal how their systems work, society will remain in the dark, shaped by unseen forces.

Regulation alone is not enough. We need alternative digital spaces designed to foster productive discourse rather than amplify hate or exploit user data for profit. Google has proposed a so-called “Habermas machine”—an AI designed to mediate between opposing arguments. While this overlooks the importance of human deliberation, the idea of creating digital forums that encourage reasoned discussion remains valid. Eli Pariser, a co-founder of Avaaz.org, suggests viewing the internet as a city. Currently, platforms like Facebook function as shopping malls, where users are both consumers and products. Other corners of the web resemble lawless backstreets, controlled by algorithmic “gangsters” who reward sycophants and punish dissenters. What’s missing are online equivalents of public spaces—town halls, libraries, and speakers’ corners. In Europe and the UK, where public service media have thrived, such spaces could be publicly funded. In the United States, Ethan Zuckerman, the inventor of the pop-up ad, has proposed taxing for-profit social media companies to fund civic-minded platforms.

Lastly, we need a new model of journalism to address why people are drawn to disinformation and conspiracy theories in the first place. Much effort is spent countering false narratives, but less attention is given to why people seek them out. Research shows that conspiracy theories thrive among those who feel powerless and disconnected. Online conspiracy communities offer a false sense of belonging and purpose, ultimately serving the interests of their orchestrators. Instead of focusing solely on fact-checking, journalism should work to restore a sense of agency and engagement. 

Jeff Jarvis, former head of Journalism Innovation at the Craig Newmark Graduate School of Journalism at CUNY, argues for a shift in journalism’s role—from merely reporting facts to serving as a social tool that responds to people’s frustrations and feelings of abandonment. This approach mirrors an older, 18th-century vision of the public sphere, where journalism was less about strict objectivity and more about empowering the marginalized. It requires rethinking news coverage and impact—mere clicks should not be the metric of success. It also involves cross-disciplinary collaboration, such as partnering with lawyers to push for justice, rather than just reporting on wrongdoing. Traditional journalism will always have its place, but new, proactive forms must emerge alongside it.

The collapse of the public sphere poses a challenge for any country. However, Moldova feels this pressure most acutely. Situated between Ukraine and Romania, this nation of three million faces relentless Russian influence operations, turning it into a large-scale testing ground for subversion. Russia and its proxies spent an estimated $200 million in 2024 on vote-buying schemes, covert and overt propaganda campaigns, the co-opting of religious groups, party funding, cyberattacks, and many other tools.

Yet Moldova could also serve as a model for how democracies defend themselves against such attacks while staying true to their values.  

To counteract the surge of online disinformation campaigns, Moldova intends to collaborate with EU enforcement teams under the Digital Services Act (DSA). This legislation aims to require tech companies to reduce the harmful impacts of online manipulation on civic discourse and elections. However, since the law is still new, its precise implications and the extent of compliance from tech firms remain uncertain. Ideally, tech companies would establish rapid-response units to tackle subversive activities, swiftly and transparently disclose the sources of propaganda campaigns, and provide researchers with access to relevant data. This would help prevent situations like the one in Romania, where the origins of a pro-Kremlin candidate’s campaign were unclear. 

Moldova has also introduced a law regulating broadcast media to curb disinformation deemed a threat to national security. Drafted over the course of a year in collaboration with the Council of Europe, the law seeks to balance national security concerns with freedom of speech. However, enforcement is slow, involving initial warnings and then fines for media outlets running strategic disinformation campaigns. While it may help reduce long-term Russian influence efforts, it is not a foolproof solution.

Yet legislative measures alone cannot eliminate subversive campaigns in a democracy. Engaging with the audiences exposed to Russian propaganda is equally crucial. Many of these audiences are primarily Russian speakers who receive little content from mainstream media. They often hold an idealized perception of life in “Great Russia” and accept the Kremlin’s narrative about the so-called decay of “Gayeuropa.” To counter this, a new generation of communication strategies is needed—ones that address the deep-seated emotional concerns and fears exploited by Kremlin propaganda. Simply fact-checking and debunking myths about the EU is insufficient. Instead, efforts should focus on tackling the alienation and identity crisis that Russian disinformation feeds on. This means investing in diverse forms of content beyond news, such as entertainment programs and community discussions, to foster a more informed and engaged public. 


Peter Pomerantsev is a senior fellow at the SNF Agora Institute at Johns Hopkins University, where he co-directs the Arena Initiative. From 2017 to 2020, he was a senior fellow at the London School of Economics and Political Science, where he directed the Arena Initiative, a research project dedicated to overcoming the challenges of digital-era disinformation and polarization. His book on Russian propaganda, Nothing Is True and Everything Is Possible, won the 2016 Royal Society of Literature Ondaatje Prize and was nominated for the Samuel Johnson, Guardian First Book, Pushkin House and Gordon Burn Prizes.