An Argument for Real Platform Accountability
Where the Discussion Goes Wrong
Regulatory responses tend to import frameworks from traditional media, which do not map well to platform realities. Treating platforms like publishers raises difficult speech concerns; treating them like neutral infrastructure lets too much harmful conduct escape accountability.
The accountability gap widens as platforms grow. A platform with billions of users cannot plausibly claim to be a neutral pipe; the scale of curation decisions is itself a form of editorial power.
What I Would Do
Mandated transparency about algorithmic decision-making would be a meaningful step. Not public exposure of proprietary systems, but audited disclosure to regulators and researchers. This would enable the kind of oversight that is currently impossible.
Platform liability for systemic harms, rather than for individual pieces of content, would change incentive structures substantially. Platforms would have to design safer systems rather than optimize for engagement while using legal doctrines as shields against responsibility.