We share an infrastructure for knowing whom to trust. It consists of signals, heuristics, and institutions that help us distinguish reliable communication from noise: credentials, track records, the coherence of arguments, the effort visible in careful work. This infrastructure is a commons—a shared resource that no one owns but everyone depends upon.

Like any commons, it faces a structural problem. Producing sophisticated, well-argued text once required genuine expertise—the cost of production correlated with quality. That correlation was the foundation of our filtering heuristics. Large language models have broken it. Fluent prose now signals access to tools, not understanding. Each individual who uses AI to simulate expertise they lack gains a private benefit while externalizing a small cost to the shared infrastructure of trust. Aggregate enough of these individually rational choices, and the commons degrades.

This is the tragedy of the epistemic commons—not a moral failing, but a structural dynamic.


Elinor Ostrom won the Nobel Prize in Economics for showing that commons don’t always collapse. Communities have governed shared pastures, fisheries, and irrigation systems for centuries—without privatization, without state control. Her research identified eight design principles that distinguish successful commons governance from failure.

If epistemic trust is genuinely a commons under structural threat, Ostrom’s principles become a framework for questions we don’t yet know how to answer.

1. Clearly defined boundaries
Who belongs to the community using the resource? What are the resource's limits?

Who counts as a participant in the epistemic commons? Everyone who reads and writes? Professionals in knowledge-intensive fields? And what exactly is the shared resource—trust? Attention? The reliability of quality signals?

2. Rules match local conditions
Governance must fit the specific resource and community.

What “local conditions” exist in epistemic communities? Academic publishing differs from journalism differs from corporate communication. Can there be general principles, or only domain-specific norms?

3. Users participate in rule-making
Those affected by rules have a voice in creating them.

Who should set norms for AI use in communication? Platform owners? Professional associations? Emergent community practice? What legitimacy would any such rules have?

4. Monitoring
Compliance must be observable, by users or by parties accountable to them.

Can AI-assisted production even be monitored? Detection tools exist but are unreliable. Is transparency (voluntary disclosure) the only viable monitoring mechanism? What makes disclosure credible?

5. Graduated sanctions
Violations meet proportional consequences.

What would proportional consequences for epistemic free-riding look like? Reputational damage? Exclusion from communities? And who would enforce them?

6. Low-cost conflict resolution
Disputes need accessible resolution mechanisms.

When someone is accused of passing off AI work as their own, what process adjudicates? Current examples—academic misconduct cases, journalism scandals—suggest we lack good mechanisms.

7. Right to organize recognized by external authorities
Self-governance requires external legitimacy.

Would legal systems, platforms, or institutions recognize and support community-developed norms around AI transparency? Or would they override them?

8. Nested enterprises
For larger systems, governance occurs at multiple scales.

The epistemic commons operates globally. Can nested governance—local norms embedded in broader frameworks—work for something this distributed?


Ostrom’s cases were pastures, fisheries, and irrigation systems. Whether her framework transfers to something as abstract as epistemic trust is genuinely uncertain. But her principles at least tell us what questions to ask.


Drafted by Claude (Opus 4.5). Full documentation: MicheleLoi/epistemic-commons-ai-
For background on costly signaling collapse: Lecture on Trust in the Age of LLMs.
Picture generated with ChatGPT.
