A Better Internet for Democracy
The digital era has given rise to a new public sphere, but it is being used to undermine democracy and dignity. The Council on Technology and Social Cohesion’s “Blueprint on Prosocial Tech Design Governance” offers a comprehensive response to this crisis.
Digital Harms: A Deliberate Choice
- Infinite scroll, addictive recommendation systems, and deceptive patterns are not technical inevitabilities but deliberate design choices that reward engagement over truth, attention over well-being, and outrage over dialogue.
- Digital harms have devastating consequences: eroding mental health, fuelling polarisation, spreading disinformation, and concentrating power in a handful of corporate actors.
Instead of blaming users, tech companies must take responsibility for their own design choices. The Blueprint shifts the focus from downstream content moderation to upstream platform design.
Prosocial Building Codes
No technology is neutral by design. Companies choose what a platform will allow, prevent, or persuade people to do online.
| Tier | Description |
|---|---|
| Tier 1 | Establishing baseline protections: Safety by Design, Privacy by Design, and User Agency by Design. |
| Tier 2 | Low-barrier user experience tools like empathy-oriented reaction buttons, friction to slow down impulsive posting, and prompts to reflect before sharing. |
| Tier 3 | Prosocial algorithms that highlight areas of common ground and diverse ideas, replacing engagement-maximising recommender systems. |
| Tier 4 | Civic tech and deliberative platforms explicitly built for democratic engagement. |
| Tier 5 | Middleware solutions that restore data sovereignty and interoperability. |
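To make the Tier 2 idea of "friction" concrete, here is a minimal sketch of a reflect-before-sharing prompt. This is purely illustrative: the function and callback names are hypothetical, not drawn from the Blueprint, and a real platform would surface this as a UI dialog rather than a callback.

```python
# Hypothetical sketch of a Tier 2 friction mechanism: pause and
# prompt the user to reflect before an impulsive share.
# None of these names come from the Blueprint itself.

def share_with_friction(post_text: str, confirm) -> bool:
    """Ask the user to pause and confirm before sharing.

    `confirm` is a callback returning True/False, standing in
    for a real confirmation dialog in the platform's UI.
    """
    prompt = "Take a moment: does this post add to the conversation?"
    if confirm(prompt):
        return True   # user confirmed; the post is shared
    return False      # user reconsidered; nothing is posted

# Usage: a stand-in callback simulates the user's choice.
shared = share_with_friction("Hot take!", confirm=lambda msg: False)
```

The design point is that a single extra confirmation step adds a small cost to impulsive posting without removing the user's agency to share.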
Research Transparency and Protections
The report highlights the need for research to understand how platform design impacts society, safe harbour laws to protect independent researchers, and open data standards for measuring social trust and cohesion.
“Without these safeguards, crucial insight into systemic harms—such as manipulation, bias, and disinformation—remains inaccessible.”
Shifting Market Forces
The report concludes with a set of market reforms to shift incentives toward prosocial tech innovations.
- Codifying liability for platform-induced harms
- Enforcing antitrust to level the playing field for ethical alternatives
- Identifying a range of options for funding and monetising prosocial tech startups
Market concentration inhibits innovation and confines users within systems that prioritise profit over well-being. The report recommends shifting market forces to make prosocial tech not only possible, but competitive and sustainable.
A Call to Action
The Blueprint gives us the tools. The next step is collective action by governments, technologists, and civil society alike.
Dr. Lisa Schirch, Research Fellow with the Toda Peace Institute, is on the faculty at the University of Notre Dame in the Keough School of Global Affairs and Kroc Institute for International Peace Studies.
She holds the Richard G. Starmann Sr.
