Why do so many people, including both former President Donald Trump and new President Joe Biden, keep talking about getting rid of an obscure law called Section 230?
The short answer is that Section 230, part of the Communications Decency Act of 1996, is the legal underpinning for one of the largest and most consequential experiments in American history.
Since the birth of Big Tech Media 15 years ago, our nearly 250-year-old republic has become a test case. Can a nation’s news and information infrastructure, the lifeblood of any democracy, depend on digital media technologies that create a global free-speech zone of unlimited audience size, combined with algorithmic (non-human) curation that spreads massive volumes of disinformation with unprecedented ease?
This experiment has been possible because Section 230 grants Big Tech Media immunity from responsibility for the mass content that is published and broadcast across their platforms. A mere 26 words in the bipartisan law were originally intended to protect “interactive computer services” from being sued over what their users post, just as telephone companies cannot be sued over gossip that Aunt Mabel tells every busybody in town.
But as Facebook, Google, Twitter and other services have scaled over time to an unimaginable size, their lack of human editors has produced a gushing firehose of mis- and disinformation in which scandals and conspiracies are prioritized over real news for mass distribution.
As the gripping videos and photos of a pro-Trump mob storming the Capitol make clear, this experiment has veered frighteningly off course. So, President Biden has called for ending Section 230 immunity in order to stop the Frankenstein’s monster this law helped create.
Facebook is no longer simply a “social networking” website — it is the largest media giant in the history of the world, a combination publisher and broadcaster, with approximately 2.6 billion regular users and billions more on the Facebook-owned WhatsApp and Instagram. One study found that 104 pieces of COVID-19 misinformation on Facebook were shared 1.7 million times and had 117 million views. That’s far more than the number of daily readers and viewers of the Wall Street Journal, New York Times, USA Today, ABC News, Fox News, CNN and other major outlets combined.
Traditional news organizations are subject to certain laws and regulations, including a degree of liability over what they broadcast. While there is much to criticize about mainstream media, at least they use humans to pick and choose what’s in and out of the news stream. That results in a degree of accountability, including legal liability.
But Facebook-Google-Twitter’s robot algorithm curators are on automatic pilot, much like killer drones for which no human bears responsibility or liability.
Our government must impose a whole new business model on these corporations — just as the United States did, in years past, with telephone, railroad and power companies.
The government should treat these companies more like investor-owned utilities, which would be guided by a digital license that would define the rules and regulations of the business model (Mark Zuckerberg himself has suggested such an approach).
To begin with, such a license would require platforms to obtain users’ permission before collecting anyone’s personal data — i.e., opt-in rather than opt-out.
The new model also should encourage more competition by limiting the mega-scale audience size of these media machines. Smaller user pools could be achieved either through an antitrust breakup of the companies or through incentives to shift toward a revenue model based more on monthly subscribers, like Netflix or cable TV, rather than on hyper-targeted advertising — a shift that would naturally shrink the user base. The utility model also should restrain the use of specific engagement techniques, such as hyper-targeting of content, automated recommendations and addictive behavioral nudges (like autoplay and pop-up screens).
I believe we can retain what is good about the internet without the toxicities. It is crucial that regulation evolves in order to shape this new digital infrastructure — and the future of our societies — in the right way.
Steven Hill is the former policy director at the Center for Humane Technology.