Consequences of Repealing Section 230, the ‘Law That Built the Internet’

May 13, 2025 | FlaglerLive | 1 Comment

Sens. Lindsey Graham, R-S.C., and Dick Durbin, D-Ill., are vocal critics of Section 230. (AP Photo/J. Scott Applewhite)

By Daryl Lim

Section 230 of the Communications Decency Act, passed in 1996 as part of the Telecommunications Act, has become a political lightning rod in recent years. The law shields online platforms from liability for user-generated content while allowing them to moderate that content in good faith.

Lawmakers including Sens. Lindsey Graham, R-S.C., and Dick Durbin, D-Ill., now seek to sunset Section 230 by 2027 to spur a renegotiation of its provisions. The senators are expected to hold a press event before April 11 about a bill to start a timer on reforming or replacing Section 230, according to reports. If no agreement is reached by the deadline, Section 230 would cease to be law.




The debate over the law centers on balancing accountability for harmful content with the risks of censorship and stifled innovation. As a legal scholar, I see dramatic potential effects if Section 230 were repealed: some platforms and websites would likely block any potentially controversial content. Imagine Reddit with no critical comments, or TikTok stripped of political satire.

The law that built the internet

Section 230, often described as “the 26 words that created the internet,” arose in response to a 1995 ruling penalizing platforms for moderating content. The key provision of the law, (c)(1), states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This immunizes platforms such as Facebook and Yelp from liability for content posted by users.

Importantly, Section 230 does not offer blanket immunity. It does not shield platforms from liability related to federal criminal law, intellectual property infringement, sex trafficking, or cases where platforms co-develop unlawful content. At the same time, Section 230 allows platform companies to moderate content as they see fit, letting them block harmful or offensive content even when that content is protected by the First Amendment.

Some critics argue that the algorithms social media platforms use to feed content to users are a form of content creation and should be outside the scope of Section 230 immunity. In addition, Federal Communications Commission Chairman Brendan Carr has signaled a more aggressive stance toward Big Tech, advocating for a rollback of Section 230’s protections to address what he perceives as biased content moderation and censorship.

What Section 230 does and how it came about.

Censorship and the moderation dilemma

Opponents warn that repealing Section 230 could lead to increased censorship, a flood of litigation and a chilling effect on innovation and free expression.




Section 230 grants complete immunity to platforms for third-party activities regardless of whether the challenged speech is unlawful, according to a February 2024 report from the Congressional Research Service. In contrast, immunity via the First Amendment requires an inquiry into whether the challenged speech is constitutionally protected.

Without immunity, platforms could be treated as publishers and held liable for defamatory, harmful or illegal content their users post. Platforms could adopt a more cautious approach, removing legally questionable material to avoid litigation. They could also block potentially controversial content, which could leave less space for voices of marginalized people.

MIT management professor Sinan Aral warned, “If you repeal Section 230, one of two things will happen. Either platforms will decide they don’t want to moderate anything, or platforms will moderate everything.” The overcautious approach, sometimes called “collateral censorship,” could lead platforms to remove a broader swath of speech, including lawful but controversial content, to protect against potential lawsuits. Yelp’s general counsel noted that without Section 230, platforms may feel forced to remove legitimate negative reviews, depriving users of critical information.

Corbin Barthold, a lawyer with the nonprofit advocacy organization TechFreedom, warned that some platforms might abandon content moderation to avoid liability for selective enforcement. This would result in more online spaces for misinformation and hate speech, he wrote. However, large platforms would likely not choose this route to avoid backlash from users and advertisers.



A legal minefield

Section 230(e) currently preempts most state laws that would hold platforms liable for user content. This preemption maintains a uniform legal standard at the federal level. Without it, the balance of power would shift, allowing states to regulate online platforms more aggressively.

Some states could pass laws imposing stricter content moderation standards, requiring platforms to remove certain types of content within defined time frames or mandating transparency in content moderation decisions. Conversely, some states may seek to limit moderation efforts to preserve free speech, creating conflicting obligations for platforms that operate nationally. Litigation outcomes could also become inconsistent as courts across different jurisdictions apply varying standards to determine platform liability.

The lack of uniformity would make it difficult for platforms to establish consistent content moderation practices, further complicating compliance efforts. The chilling effect on expression and innovation would be especially pronounced for new market entrants.

While major players such as Facebook and YouTube might be able to absorb the legal pressure, smaller competitors could be forced out of the market or rendered ineffective. Small or midsize businesses with a website could be targeted by frivolous lawsuits. The high cost of compliance could deter many from entering the market.

Reform without ruin

The nonprofit advocacy group Electronic Frontier Foundation warned, “The free and open internet as we know it couldn’t exist without Section 230.” The law has been instrumental in fostering the growth of the internet by enabling platforms to operate without the constant threat of lawsuits over user-generated content. Section 230 also lets platforms organize and tailor user-generated content.

The potential repeal of Section 230 would fundamentally alter this legal landscape, reshaping how platforms operate, increasing their exposure to litigation and redefining the relationship between the government and online intermediaries.

Daryl Lim is Professor of Law and Associate Dean for Research and Innovation at Penn State.

The Conversation arose out of deep-seated concerns for the fading quality of our public discourse and recognition of the vital role that academic experts could play in the public arena. Information has always been essential to democracy. It’s a societal good, like clean water. But many now find it difficult to put their trust in the media and experts who have spent years researching a topic. Instead, they listen to those who have the loudest voices. Those uninformed views are amplified by social media networks that reward those who spark outrage instead of insight or thoughtful discussion. The Conversation seeks to be part of the solution to this problem, to raise up the voices of true experts and to make their knowledge available to everyone. The Conversation publishes nightly at 9 p.m. on FlaglerLive.


Comments

  1. Pogo says

    May 14, 2025 at 8:31 pm

    @So, Copilot, please speak to the following:

    The intersectionality of bf skinner and repeal of Section 230 of the Communications Decency Act.

    Copilot responded:

    B. F. Skinner’s legacy in behavioral psychology—especially his insights into operant conditioning—provides a thought-provoking lens through which to view modern digital governance. Skinner demonstrated that behavior can be shaped by its consequences: rewards increase the likelihood of a behavior recurring, while punishments suppress unwanted behaviors. This idea has influenced areas ranging from education to user experience design, where strategic incentives have long been used to “nudge” behavior in desired directions.

    Online platforms operate in ways that mimic a Skinner box. In today’s digital ecosystems, actions like clicking “like,” sharing a post, or receiving a notification act as reinforcers, encouraging certain types of user engagement. Section 230 of the Communications Decency Act has long underpinned this environment by shielding platforms from liability for third‑party speech. This legal protection gives companies the freedom to moderate content—as they see fit—without the constant threat of lawsuits over user-generated content, essentially allowing them to shape user behavior through algorithmic “reinforcement” without the fear of regulatory backlash.

    Now, with active debates over repealing Section 230, we are poised to witness a potential shift in this conditioning dynamic. If platforms lose this legal safeguard, they may feel compelled to adopt far more aggressive content moderation policies to avoid legal jeopardy. In effect, the “punishment” aspect of behavior modification—swiftly removing or censoring content—could become more pronounced. This kind of over-moderation aligns with Skinner’s concept of punishment shaping behavior; however, in a digital context, it risks creating an environment where users quickly learn to self-censor or alter their behavior to avoid triggering punitive measures. Such a shift might undermine the open exchange of ideas, much like a tightly controlled conditioning chamber, where only behaviors deemed “safe” are tolerated.

    At the intersection of Skinner’s theory and the legal debate around Section 230, we encounter a fundamental question: How do we balance the rights of free expression with the need for accountability and safety online? On one hand, a laissez‑faire digital space—enabled in part by Section 230—allows for robust, if messy, public discourse. On the other, removing Section 230 risks transforming these platforms into environments where the threat of legal action forces a punitive, over‑cautious moderation style. This dynamic mirrors the tension between positive reinforcement and punishment in behavioral conditioning, suggesting that how we choose to “program” our digital environments has deep implications for the diversity and spontaneity of public discourse.

    This intersection invites us to ask further questions: Could moderation strategies inspired by Skinner’s principles—favoring carefully calibrated positive reinforcement over heavy-handed punishment—offer a more nuanced solution? How might digital platforms innovate ways to encourage healthy discourse without resorting to the blunt instrument of absolute liability? And what safeguards might ensure that efforts to curb harmful content do not inadvertently stifle legitimate expression? These questions not only deepen the debate over Section 230’s future but also highlight the broader challenge of designing digital ecosystems that respect both the power of behavioral incentives and the essential value of free speech.

    Thanks Copilot, that checks out.

    Copilot closed:

    Would you like to explore more on potential policy innovations that draw from behavioral psychology, or perhaps dive into historical examples of how environmental contingencies have shaped public discourse?

    Well, sure; but not tonight.


