
Rethinking Digital Platform Design: A Systems Approach

By Lisa Schirch | Technology | 2025-06-04, 10:19pm


Credit: MarcoVector/Shutterstock.com



In 2025, we find ourselves at a turning point in the digital age. Online platforms have become the modern-day public square, yet rather than nurturing democracy, dignity, and informed discourse, many digital platforms now amplify polarisation, disinformation, and manipulation—all in the name of profit.

A new report by the Council on Technology and Social Cohesion, titled “Blueprint on Prosocial Tech Design Governance”, offers a transformative vision for how digital spaces can be reshaped to serve the public good. It presents a comprehensive systems-level strategy to move beyond the fragmented and reactive approaches that have dominated tech reform efforts thus far.

The harms we associate with digital platforms—rampant misinformation, addictive scrolling, toxic interactions—aren’t accidents. They are the direct outcomes of intentional design choices. Infinite scroll, algorithmic recommendation engines that reward outrage, and dark patterns that nudge users into unintended behaviours all prioritise engagement and profit over well-being, accuracy, and social trust.

Despite this, major tech companies often shift the blame onto users, suggesting that harmful content is simply a result of bad actors or poor choices. The Blueprint challenges this narrative by turning attention upstream—toward how platforms are designed in the first place.

Technology is never neutral. Platform design determines what users can and cannot do, and more subtly, what they are encouraged, nudged, or manipulated into doing. Just as architects use building codes to ensure physical spaces are safe and accessible, the Blueprint proposes a certification framework—a kind of "building code" for the digital world.

The proposed tiered certification model introduces five levels of ambition for platform design, ranging from basic safety to advanced democratic participation:

Tier 1 focuses on baseline protections such as Safety by Design, Privacy by Design, and User Agency by Design. These features empower users with more control over content visibility, data tracking, and the ability to opt out of manipulative features.

Tier 2 enhances user experience with tools like empathy-based reaction buttons, prompts to reduce impulsive posting, and reflection nudges before sharing content.

Tier 3 replaces divisive, engagement-maximising algorithms with prosocial alternatives that elevate diverse ideas and highlight common ground.

Tier 4 introduces civic and deliberative platforms purpose-built for democratic engagement, while Tier 5 envisions middleware solutions that give users data sovereignty and ensure interoperability across digital systems.
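The contrast at the heart of Tier 3, between ranking that maximises engagement and ranking that elevates common ground, can be made concrete with a small sketch. The report describes the principle, not an implementation; the field names, weights, and scoring formula below are illustrative assumptions, not anything specified in the Blueprint.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement: float          # e.g. interactions per hour
    outrage_score: float       # 0..1, hypothetically from a toxicity classifier
    viewpoint_diversity: float # 0..1, hypothetical measure of cross-cutting reach

def engagement_rank(posts):
    """Status quo: rank purely by engagement, which tends to reward outrage."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

def prosocial_rank(posts, outrage_penalty=0.7, diversity_bonus=0.5):
    """Tier-3-style alternative: damp outrage-driven engagement and
    reward content that reaches across viewpoints. Weights are arbitrary."""
    def score(p):
        return (p.engagement * (1 - outrage_penalty * p.outrage_score)
                + diversity_bonus * p.viewpoint_diversity)
    return sorted(posts, key=score, reverse=True)
```

Under such a scheme, a highly engaging but inflammatory post can rank below a moderately engaging post that bridges audiences, which is exactly the inversion of incentives the Blueprint is asking platforms to certify.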

Central to this model is the idea that independent research and transparency must underpin any meaningful change. The Blueprint calls for:

Mandatory platform audits to reveal how algorithms function.

Safe harbour protections for researchers to investigate harms without fear of retaliation.

Open data standards to measure societal outcomes like trust, well-being, and pluralism.

The Blueprint doesn’t stop at design—it also addresses the market forces fuelling toxic digital environments. Current funding models, especially venture capital, incentivise scale, profit, and user retention over ethical or democratic outcomes. This entrenches a cycle of antisocial design.

The report warns that market concentration among a few dominant tech giants stifles innovation and prevents ethical alternatives from gaining traction. These monopolies control not just content but the infrastructure, data, and monetisation channels.

To rebalance the ecosystem, the Blueprint recommends:

Codifying legal liability for harms caused by platform design.

Enforcing antitrust laws to open the field for ethical competition.

Exploring alternative funding models that support prosocial innovation.

Too often, tech regulation has been fragmented, reactive, and inadequate. By applying a systems lens, the Blueprint provides a proactive roadmap for real change—ensuring that platforms are built not just to profit, but to serve society.

It calls on governments to adopt tiered certifications that reward responsibility, on investors to back prosocial innovation, and on designers to put the needs of marginalised communities at the centre of the user experience.

At its heart, the Blueprint argues that platform design is a form of social engineering. Just as cities are shaped by urban planning, digital environments are shaped by intentional design. Today, they amplify outrage and division—but they could foster empathy, collaboration, and truth.

What remains is political will. The tools now exist to redesign digital platforms for the public good. The question is whether societies will act on them.

Dr. Lisa Schirch is a Research Fellow at the Toda Peace Institute and holds the Richard G. Starmann Sr. Endowed Chair at the University of Notre Dame’s Keough School of Global Affairs. She directs the Peacetech and Polarization Lab and has authored eleven books, including The Ecology of Violent Extremism and Social Media Impacts on Conflict and Democracy. Her work focuses on leveraging technology to improve state-society relationships and build social cohesion.