The dominant approach to addressing information reliability in digital society focuses on technological interventions: tweaking recommendation algorithms, labeling misinformation, promoting “authoritative” sources, and implementing fact-checking systems. This techno-solutionist framework misdiagnoses the fundamental problem and proposes remedies that cannot address the deeper social transformations at work.
The Techno-Solutionist Misdiagnosis
Techno-solutionism frames information reliability primarily as an algorithmic or content moderation problem. Concrete examples include:
- Algorithmic Debiasing: Eli Pariser’s “The Filter Bubble” (2011) offered an early diagnosis of problematic algorithmic filtering and prompted numerous technical proposals for “debiasing” recommendation systems. Facebook’s (now Meta’s) 2018 “Meaningful Social Interactions” ranking change exemplifies this approach.
- Fact-Checking Integration: Platforms such as Facebook partner with third-party fact-checkers to label content, an arrangement examined in Garton Ash et al.’s “GLASNOST! Nine Ways Facebook Can Make Itself a Better Forum for Free Speech and Democracy” (2019).
- Quality Rankings: Google’s E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) criteria, set out in its search quality guidelines, privilege certain sources on the basis of assessed reliability (a toy sketch of this style of ranking follows this list).
- Nudge Architectures: Following Thaler and Sunstein’s “Nudge” (2008), platforms use interface design to encourage “better” information consumption.
- Content Moderation Systems: Twitter’s Birdwatch (now Community Notes) and similar distributed schemes crowdsource reliability assessments while maintaining algorithmic oversight.
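To make concrete the kind of system this section critiques, the sketch below shows a toy, hypothetical “quality ranking”: posts are reordered by multiplying predicted engagement with a platform-assigned source “trust” score. The Post class, the rerank function, and the score values are illustrative assumptions, not any platform’s actual algorithm.

```python
# Toy illustration of a "quality ranking" intervention. The scores and the
# rerank() function are hypothetical, not any platform's real system; the
# point is only to show how such interventions encode reliability as a
# numeric property of sources.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    engagement: float    # predicted interactions (clicks, likes, replies)
    source_trust: float  # platform-assigned reliability score in [0, 1]


def rerank(posts: list[Post]) -> list[Post]:
    """Order posts by engagement weighted by the platform's trust score."""
    return sorted(posts, key=lambda p: p.engagement * p.source_trust, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Claim from an institutional outlet", engagement=0.4, source_trust=0.9),
        Post("Same claim reported in a community forum", engagement=0.8, source_trust=0.3),
    ]
    for post in rerank(feed):
        print(f"{post.text} (score={post.engagement * post.source_trust:.2f})")
```

Whatever numbers are plugged in, the formula treats reliability as an abstract attribute of sources, detached from the associational contexts in which, on the account developed here, reliability is actually formed.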
These approaches, while well-intentioned, fundamentally misunderstand the social nature of knowledge and overlook the deeper transformations in associational life that have altered how reliability is constructed.
The Associational Reality of Knowledge
Drawing on Tocqueville and Rawls, we understand that knowledge has always been socially situated within associations. In a liberal society, diverse associations legitimately develop different approaches to evidence, authority, and interpretation. Rawls’s “burdens of judgment” explain why reasonable people inevitably reach different conclusions, even with access to the same information.
What has changed is not primarily the filtering of information but the nature of associations themselves:
- Dematerialization: As Sherry Turkle documents in “Alone Together” (2011), digital technologies foster connections that lack the depth and materiality of traditional relationships and that fail to satisfy deeper human needs for embodied presence. Similarly, Vincent Miller’s “The Crisis of Presence in Contemporary Culture” (2016) directly addresses how digital media produce “phatic” connections: communications that maintain contact without substantive exchange and that lack the materiality giving traditional associations their formative power.
- Authority Crisis: Gurri’s analysis reveals how digital networks systematically undermine established authorities without building stable alternatives, producing what he terms “the revolt of the public” against all institutional knowledge claims.
Why Algorithmic Interventions Fail
Algorithmic interventions cannot address these deeper transformations for several reasons:
- Misplaced Focus: They target information distribution rather than the social contexts that give information meaning and credibility.
- Forced Consensus: They attempt to create artificial consensus in a society where Rawls’s burdens of judgment ensure legitimate epistemic diversity.
- Authority Substitution: They replace eroded social authorities with platform-based technical authorities, further weakening authentic associational life.
- Dematerialization Acceleration: Many technical solutions further remove knowledge from material contexts, exacerbating rather than addressing the dematerialization of social life that Turkle identifies as problematic.
- Liberal Contradiction: They often restrict the very associational freedom that liberalism considers essential for developing diverse conceptions of the good.
The Necessary Approach
Rather than algorithmic fixes, addressing the crisis of reliable information requires:
- Strengthening Associational Integrity: Supporting the development of associations with sufficient material presence and internal authority structures to serve as meaningful contexts for knowledge formation
- Accepting Epistemic Pluralism: Recognizing that beyond a minimal factual consensus, diverse associations will and should maintain different epistemic frameworks
- Building Social Infrastructure: Creating opportunities for productive engagement across different epistemic communities without demanding comprehensive agreement
- Balancing Associational Freedom and Public Reason: Respecting associational autonomy while establishing minimal shared standards for claims relevant to public deliberation
The Liberal Reality
A liberal society necessarily contains diverse associations with different approaches to knowledge. Techno-solutionist attempts to engineer consensus through algorithmic means not only fail to address the real crisis but actively undermine the associational freedom essential to liberalism.
The path forward lies not in technical fixes to information flow but in rebuilding the social infrastructure that sustains meaningful associational life. This approach accepts epistemic divergence as appropriate in a liberal society while ensuring that it occurs within associations robust enough to support identity formation, self-respect, and productive cross-associational engagement.