Public Copy of 2025 Platform Regulation Syllabus

Foundations of Internet Speech Platform Regulation
Winter Quarter, 2025
Stanford Law School
Instructor: Daphne Keller

Schedule and assignments

Day 1 - January 7 - Introduction

We will spend time on course logistics, and on an overview of topics the class will explore.

Reading:
  • California’s Age Appropriate Design Code, subsection 1798.99.31(a). Don’t focus too much on the details, especially since this part of the law has been held unconstitutional. Ask yourself what a platform should do to comply with the law as a practical matter.
  • Optional: NetChoice v. Bonta, 9th Circuit (2024) (affirming in part, on First Amendment grounds, a preliminary injunction against enforcing key provisions of the AADC)

Day 2 - January 9 - Conceptual Foundations for Platform Regulation

We will discuss the Lessig article as a frame for questions about how the law reconciles technical, commercial, and operational realities with society’s policy goals or constitutional mandates. This will likely be the most theory-oriented reading of the whole quarter.
Reading:
  • Larry Lessig, The Law of the Horse (main text, you can skip the footnotes). Don’t worry about the specific laws or technologies he discusses. Focus on the arguments about regulation, power, and constitutional rights, and how the Internet might change legal thinking on these topics.

Day 3 - January 14 - Speech and Content Moderation at Scale

Moderating or regulating speech on Internet platforms is very different from the adjudication of individual cases that courts typically engage in. Differences include scale, the impact of ex ante product design decisions, and the availability and capacities of technical tools.
Reading:
  • On history and mechanics: Kate Klonick, The New Governors, pages 1599 to the top of 1658 (skip the footnotes)
  • Optional: Browse the great library of materials maintained by the Trust and Safety Professionals Association

Day 4 - January 16 - Merits Claims Against Platforms

If platforms had no immunities, and plaintiffs could bring content-related claims against platforms, what would happen? Understanding the “merits” or substantive law obligations in such cases (as opposed to procedural issues or immunity questions) can help us understand what that world would look like. Merits law also matters because substantive legal claims actually get litigated against platforms far more than many people imagine.
Reminder: Paper proposals due by midnight January 16.
Reading:
  • Excerpts from Robert Hamilton, “Defamation” chapter in Kent Stuckey, Internet & Online Law (1996): read the Table of Contents and Sections 2.03(3)(a) (“‘Publication’ and ‘Republishers’ at Common Law”) and 2.03(3)(b)(i) (“Distributors and ‘Reason to Know’”)
  • Federal Criminal Statutes here
  • Criminal Material Support of Terrorism statutes: 18 U.S.C. 2339A (read all) and 18 U.S.C. 2339B (read only part (a))
  • Optional: CRS report explaining these statutes and breaking down elements
  • Federal Criminal Aiding and Abetting statute, 18 U.S.C. 2

Day 5 - January 21 - Copyright

Copyright is one of the most important areas of law for lawyers advising U.S. platforms. U.S. copyright damages are extremely high, litigation is common, and intellectual property claims are not immunized by CDA 230. Instead, they are conditionally immunized under the “notice and takedown” system of Section 512 of the Digital Millennium Copyright Act (DMCA). Practical and legal precedents from the DMCA are often relevant for non-copyright questions in the U.S. and around the world.
Reading:

Day 6 - January 23 - Making Platforms Remove User Speech and the First Amendment (“Intermediary Liability”)

If CDA 230 and other common law or statutory limits on content-based claims against platforms went away, what rules would remain based on the U.S. Constitution or its international analogs?
Reading:
  • Daphne Keller, Internet Platforms, pages 16-20 (Section titled “Speech Consequences and the First Amendment”)
  • Jack Balkin, Free Speech Is a Triangle, Sections I.A through II.B (i.e., everything prior to II.C, “Privatized Bureaucracy”)
  • Optional: For an early and non-political U.S. case assessing state action “jawboning” through pressure on private intermediaries, see Backpage v. Dart
  • Optional: Katie Harbath and Matt Perault, Jawboned

Day 7 - January 28 - Making Platforms Carry User Speech and the First Amendment (“Must-Carry”)

“Must-carry” claims seeking to compel platforms to reinstate content or accounts have existed since the 1990s, but became politicized and much more common in recent years. Until 2022, platforms had won every case. This was usually (1) because plaintiffs could not make out the merits of tort or contractual claims, and had no First Amendment claims against private platforms, (2) because of platforms’ Terms of Service, (3) because of platforms’ immunities under CDA 230, or (4) because of platforms’ own First Amendment rights to set editorial policies.
In 2021, Texas and Florida both enacted must-carry laws restricting platforms’ discretion to moderate content. The laws also impose transparency requirements, including mandates for platforms to provide individual notices to users about content moderation decisions and (in Texas’s case) to allow users to appeal those decisions. Platforms challenged those laws based on CDA 230, the Dormant Commerce Clause, and the First Amendment. The 11th Circuit struck down most challenged provisions of Florida’s law; the 5th Circuit upheld all of Texas’s. The Supreme Court reviewed both cases in a 2024 ruling, Moody v. NetChoice, vacating and remanding both appellate decisions.
Reading:
  • Rough list classifying and linking to major precedent at issue in must-carry cases
  • Optional: Daphne’s NetChoice Legal Arguments and Options tracking doc
  • Optional: USTA v. FCC, denial of rehearing en banc, excerpt from the dissent of Judge Kavanaugh
  • Optional: Blake Reid, Uncommon Carriage (arguing that there is no single coherent doctrine defining “(1) the classification of ‘common carriers,’ (2) the imposition of ‘common carriage’ rules on those carriers, and (3) the First Amendment problems that flow from the imposition”)

Day 8 - January 30 - Section 230

The CDA has, along with the DMCA, been one of the two primary pillars of U.S. intermediary liability law for nearly three decades. It is widely considered to be the law that “made Silicon Valley” or “created the Internet.”
For a decade, it has faced escalating attacks. Today we will be talking about the source of most Section 230 holdings: its immunity for platforms that host certain unlawful content. That immunity was the target of most early critiques of the law, and of most critiques from the political left today. I may refer to these kinds of CDA 230 cases as “must-remove” cases, to distinguish them from the more recent spate of “must-carry” cases.
We will also talk about the "edges" of CDA 230 immunity -- platform actions that are not, or potentially not, immunized.
Reading:
  • Optional: Anything Jeff Kosseff has written about CDA 230
  • Optional: Look at recent developments on Eric Goldman’s blog
  • Optional: FOSTA redline showing changes to previous law

Day 9 - February 4 - Section 230 Continued

Today we will be discussing more “edges” of 230 immunity for unlawful content, as well as CDA 230 as a source of immunity for must-carry claims that challenge platforms' decisions to *remove* lawful content.
Reading:
  • NTIA petition, Summary of Argument (pp. 3-6) and Appendix A (pp. 53-55). Note that this petition is a product of the first Trump Administration and is widely expected to be revived in the present administration. Please read with an eye to arguments you would make as a lawyer representing must-carry plaintiffs if this proposal became law.
  • Fyk v. Facebook (unpublished 9th Cir. 2020) (NOTE - This is an example of a court applying 230(c)(1) in a must-remove case, and explaining why this does not render 230(c)(2) mere surplusage. After assigning it, I realized that the ruling doesn’t explain the context very well.)
  • Daphne Keller, FAQs about the NetChoice Cases at the Supreme Court Part 1, Questions 2(b) and 2(c)

Day 10 - February 6 - International Models

The U.S. adopted Internet regulations earlier than many other countries, and some key laws have been largely unaltered since the 1990s. Many other countries have adopted newer laws in the interim, or rendered court rulings on topics that in the U.S. have been largely foreclosed by statutes. In this class we will discuss international regulatory models and court rulings.
Reading:
  • EU Digital Services Act, Articles 4-9, 14-18, 20-23, 27, 33-38, and 40. Read the heading of each Article, and read Articles 7, 8, and 18-21 in full.
  • Optional: House Judiciary Committee letter to the European Commission, January 31, 2025
  • Optional: Tarlach McGonagle’s great list of EU and international law resources for Internet and freedom of expression
  • TBD: possible updates based on the Brazilian Supreme Court

Day 11 - February 11 - Generative AI and Its Platform Law Predecessors

We will examine AI through the lens of platform law, looking at “scraping” or data collection for AI training, as well as questions that arise for platforms about content posted or created by users.
Reading:
  • Computer Fraud and Abuse Act, 18 U.S.C. 1030 excerpts. For this course we are concerned only with civil claims. Don’t worry about anything involving government computers or criminal claims.

Day 12 - February 13 - Platform Ranking Algorithms

Legal theories abound about the regulation of algorithms that platforms use to rank user content. Should they create liability for platforms in some circumstances? How do they interact with immunity statutes? Is algorithmic output constitutionally protected speech by platforms? Today we will examine those questions.
Reading:
  • How ranking algorithms work:
    • Amicus Brief of the Integrity Institute, Gonzalez v. Google, pages 3-9
  • Tort liability
  • 230 immunity:
    • Amicus Brief of Cox and Wyden, Gonzalez v. Google (for the drafters’ assertions about legislative intent, the history of 1990s recommendation systems, and 230 analysis, including a good review of basic 230 analysis)
  • First Amendment:
    • Review Moody fn 5 and the Barrett concurrence on algorithmic rankings

Day 13 - February 18 - Child Safety and Age Verification

Many lawmakers want platforms to exclude children, show them different content, or otherwise provide them with a different online experience. In order to do so, platforms must ascertain which users are children. The mechanics and constitutionality of age “verification” or “assurance” have been debated and litigated in courts around the world since the early 2000s. This year, a case about age verification -- FSC v. Paxton -- is going to the Supreme Court. It involves pornography sites, but the constitutional questions it addresses are directly relevant for new platform laws enacted by U.S. states and challenged in ongoing First Amendment litigation in lower courts.
Guest Speaker (first half of class)
Research Paper Work-in-Progress Presentations (second half of class)
Reading:
  • Outlines from students presenting research
  • Free Speech Coalition v. Paxton, 5th Circuit ruling. Be sure to read both majority and dissent, since the dissent explains the most relevant Supreme Court precedent.

Day 14 - February 20 - Content Takedown Across Borders

Research Paper Work-in-Progress Presentations (second half of class)
When can courts in one country impose their content removal requirements globally? When (and how) can they order national infrastructure providers like ISPs to block content from outside the country?
Reading:
  • Outlines from students presenting research
  • Google v. Equustek (Supreme Court of British Columbia, 2018) (read with a focus on the relevance of the U.S. order)

Day 14.5 - Makeup Class - February 21 - Platform Transparency

Recent years have seen a slew of new attention to transparency about platforms’ management and moderation of user content, including many newly enacted or proposed laws. We will discuss the goals, mechanics, and challenges of platform transparency.
Reading:
  • Optional: slides on DSA and US transparency laws

Day 15 - February 25 - Disinformation, Polarization, and Elections

Reading:
  • Chapter 2 (Andrew M. Guess and Benjamin A. Lyons)
  • Chapter 3 (Pablo Barberá)
  • Chapter 8 (Chloe Wittenberg and Adam J. Berinsky)

Day 16 - February 27 - Class canceled

Day 17 - March 4 - Converging Legal Regimes

Research Paper Work-in-Progress Presentations (second half of class)
Issues at the intersection of legal specialties -- including speech law, privacy law, and competition law -- are under-examined in the literature and in legal proposals. We have a patchwork of (mostly short) reading assignments this week, and options for people who want to dig deeper.
Reading:
  • Outlines from students presenting research

Day 18 - March 6 - Class Review and Exam Discussion

Research Paper Work-in-Progress Presentations (second half of class)
Reading
  • Outlines from students presenting research
  • Review:
  • Federal Criminal Statutes available here

Other Class Information

Response Comments and Questions

Students are responsible for short responses to the readings on five different days. I encourage you to use this as an opportunity to raise questions! You do not need to offer an answer or a conclusion. For students taking the exam, one of your responses should be to the review materials for the March 6th class review. Earlier in the term, I encourage you to prioritize responses to the January 16 - February 13 readings, since those are particularly likely to be emphasized on the exam.
Comments and questions can be in the form of (1) a post on Canvas of maximum 200 words, or (2) a comment or question (also maximum 200 words, but shorter is great) using Google Docs for the reading materials that are in Docs. I strongly encourage using Docs, because I am hoping that the annotations can become a shared resource for students during exam prep. However, please also post on Canvas (saying “I put a comment in the 17 USC 512 doc,” for example) so I know to go look, and so I have a record of all student responses in Canvas for later grading. I am creating Discussion entries in Canvas for each class; please post comments as responses to those.

Paper Instructions

I am experimenting with combining paper and exam options, which results in some fairly rigid rules. Please be advised that the paper option includes presenting your research to the class at a midway point in the process. It has the following deadlines.
  • Up to ten students can write papers. If you want to write a paper, email your proposed topics with the subject line “Class Paper Topic Proposal” by midnight January 16.
  • Outlines are due by midnight February 11. This should reflect some research and thinking about your topic and should be substantive. However, your outline can be informal, it can include open areas for additional research, and it is not a commitment to the final paper structure. Think of it as a step in both the research and writing process. Here is a sample outline that includes a research agenda and suggestions.
  • Drafts are not mandatory. If you want to have me review a draft, it is due by midnight on March 1.
  • Presentations are to be done in class on February 18th or 20th, or March 4th or 6th. Students should circulate their current outlines to the class two days before the presentation. If it is not the same outline you sent to me, please let me know. This is an opportunity to discuss your research, preliminary conclusions, and open questions, and get feedback from peers. It need not reflect final conclusions or paper structure. Presentations should last about 15 minutes including feedback time. I will post a sign-up sheet for dates. If you wind up with a date you don’t like, you can negotiate a trade with another student -- just let me know if you do so.
  • Final papers are due by Monday, March 31. They must be at least 26 pages and no more than 30. (Double-spaced, normal margins, 12-point font.)

Exam

The exam will be a one-day, self-scheduled take-home test. It will be open book and open Internet. Generative AI use for drafting assistance is permitted. Do not ask GenAI for substantive answers. It is against the rules and the answers are very likely to be wrong.
My exams typically have issue-spotting components and components that are more policy-oriented. All class material is eligible for inclusion in the exam, but the black letter law from the January 16 - February 13 classes is foundational for the course and particularly likely to appear on the exam.

Generative AI Policy

You may use Generative AI for drafting assistance in any written work, including the exam. For the exam or paper, please state in the document whether and how you have used AI tools. You may not use AI to provide actual substantive analysis or answers to questions. In addition to violating class rules, this would be risky given AI’s very high failure rate in this area of law.

Notes on Reading Material and Classroom Discussion

Difficult content: In this class we will often ask “what online content is so bad that platforms should take it down, as a matter of legal obligation or moral responsibility?” Mostly we will consider those questions in the abstract. But sometimes we will look at cases or discuss hypothetical examples of speech that is upsetting, offensive, or harmful.
Workload: The reading for the first few classes is light, because I find that the shopping period disrupts reading and students often miss things. It gets heavier in some later classes.
Assigning my own work: I assigned a lot of my own writing, sometimes for the ideas and sometimes because it concisely explains background information. I don’t love being that teacher who assigns her own work, and am very open to recommendations for substitute material for future classes.
Public and private materials: Many readings are in Google docs. These may have annotations from your fellow students and should not be shared outside of class. Feel free to make your own copies of the docs themselves for other purposes.