DAILY SIGNAL: Supreme Court to Decide Future of Section 230 Protection for Social Media Platforms

Social media companies might have to make significant changes to their platforms, depending on the outcomes of two Supreme Court cases.

This week, the justices hear arguments in two cases that involve a federal law referred to as Section 230, which protects social media platforms from being held liable for the content users post.

The high court heard arguments in Gonzalez v. Google LLC on Tuesday and will hear Twitter Inc. v. Taamneh on Wednesday. The rulings in these cases, expected sometime this summer, could affect how social media platforms operate.

Under Section 230, platforms such as Facebook and Twitter are shielded from standards that, for example, newspapers are held to. While a news outlet can be sued for knowingly publishing false information, the same strict standards don’t apply to social media sites, due to Section 230 of the Communications Decency Act of 1996.

But some argue that tech platforms should be held accountable for false or dangerous information shared on the platforms. The two cases being brought before the Supreme Court this week “should be a very interesting debate, and it may not be one that divides cleanly along ideological lines, like we typically think of them, between the conservative and more liberal justices,” says Zack Smith, manager of the Supreme Court and Appellate Advocacy Program in The Heritage Foundation’s Edwin Meese III Center for Legal and Judicial Studies. (The Daily Signal is the news outlet of The Heritage Foundation.)

Smith, who is also co-host of the “SCOTUS 101” podcast, joins “The Daily Signal Podcast” to discuss how social media platforms could be affected by the cases being argued at the Supreme Court this week.

Listen to the podcast below or read the lightly edited transcript:

Virginia Allen: Joining us today is Zack Smith, a Heritage Foundation legal fellow and co-host of the “SCOTUS 101” podcast. Zack, welcome to the show.

Zack Smith: Of course. Thank you for having me on, Virginia.

Allen: So let’s start, Zack, with that big picture. Explain a little bit further, if you would, what exactly Section 230 says and really what the point of the law is.

Smith: Sure. Well, Section 230 is part of the Communications Decency Act of 1996, and it basically says that if companies that host content on the internet, like the social media companies Facebook, Twitter, Google, and others, are simply publishing materials that others have posted, then in general they cannot be held liable for any harm caused by that content.

And Section 230 has really been referred to as one of the fundamental building blocks of the modern internet. It’s what has allowed search engines to proliferate and what has allowed social media companies to proliferate.

But there’s been a feeling in recent years that maybe Section 230, at least the way the courts have interpreted it, has gone too far, and that there should be some limits or some liability that attaches to the content posted on the internet, particularly where many of these Big Tech companies are doing something that looks like content moderation, choosing to highlight certain search results, choosing to de-emphasize or even flag certain content as potentially harmful or misleading.

And so this has been a major debate, not only in the tech community, but also in courtrooms across the country, as more and more tech companies facing lawsuits invoke Section 230 as a shield to liability.

Allen: Well, it certainly is true that social media has changed a lot since Section 230 was first implemented in 1996. We’ve seen so many changes and so many new social media platforms arise.

So why is Section 230 being challenged right now? What is the Supreme Court considering, first off, in this case Tuesday? We have a situation where a California family sued Google and YouTube over the tragic death of their 23-year-old daughter during an ISIS terrorist attack in Paris. Why does this family argue that Google and YouTube are somehow responsible for their daughter’s death?

Smith: So, it’s a really interesting question, Virginia. Basically, there are two cases that the court is considering together, one on Tuesday and one on Wednesday, in which Google and Twitter are being sued under something called the Anti-Terrorism Act, or the ATA, as it’s commonly known.

The Google case, as you mentioned, involves a victim who was killed in a terrorist attack in Paris. In the Twitter case, the victim was killed in a terrorist attack in Turkey. But both families are essentially claiming that Google and Twitter should be liable under the ATA, this Anti-Terrorism Act, because the companies essentially knowingly aided and abetted the terrorist organizations in committing these acts.

Now, the Google case that you mentioned is particularly important and particularly interesting because Google is invoking Section 230 as a shield to liability. They’re saying that they should not be held liable, that they cannot be held liable under Section 230 because they were essentially just reposting content that others had provided to them.

Now, the specifics in this case are really fascinating, because what the family in the Google case is essentially pointing to is YouTube’s recommendation of certain videos: if you go on YouTube, a lot of times YouTube will recommend follow-on videos, which often start auto-playing after the video you’re watching has finished.

And so the family is arguing that, because of those recommendations, Google should be liable under the ATA and that Section 230 doesn’t provide a defense, because what Google is doing in that instance looks more like content moderation, more like a traditional content curation process, which is distinct, at least the family argues, from simply reposting the videos of third parties.

Allen: So, Zack, given your legal expertise, what questions are the Supreme Court justices asking? Give us their perspective: How are they thinking about these cases, and what are they weighing in the arguments heard Tuesday and being heard today, on Wednesday?

Smith: Well, I’m very hesitant to make any predictions these days, Virginia, particularly at the Supreme Court. But look, we’ve seen some justices in the past, particularly Justice Clarence Thomas, express skepticism about the scope of Section 230 as it’s currently being interpreted by the lower courts.

Justice Thomas had pushed for the Supreme Court to take up a case to resolve some of the conflicts in the lower courts’ interpretations of Section 230 and its scope. And so I think you should anticipate some tough questioning from Justice Thomas and from some of the other conservative justices as well.

Now, it is interesting: Apparently, Justice Neil Gorsuch was feeling under the weather for the Google arguments, so he participated telephonically. But I anticipate the justices will have tough questions for all parties about the scope of Section 230.

One other interesting wrinkle I’ll mention, Virginia: In the Google case, Google, along with Facebook, Twitter, and some others who were parties in the lower courts, is urging the court to decide the case simply on the scope of the ATA, basically saying, “If you find that these tech companies cannot be sued under the Anti-Terrorism Act, there’s no need for the court to reach the Section 230 question.”

And so it’ll be interesting to see if the justices accept that invitation for an offramp or whether they decide to reach the Section 230 question and clarify its scope going forward.

Allen: So, in other words, we could see these cases play out without Section 230 even being considered. There’s a possibility that we wouldn’t see any changes to Section 230 in the rulings in these cases.

Smith: Yeah, that’s right. And basically, the other thing some of the justices may say is, “Look, it’s not our job, it’s Congress’ job to decide to what extent tech companies should be liable. Whether or not we think Congress has struck the right balance, that’s a question for Congress.”

And so I think in that vein, it’ll be very interesting to see what Chief Justice [John] Roberts does. That’s kind of a common refrain that he makes, appropriately so in many cases, that it’s Congress’ job to make policy determinations, not the courts’. And so this should be a very interesting debate and it may not be one that divides cleanly along ideological lines like we typically think of them, between the conservative and more liberal justices. This may be one of the rare instances where there could be some ideological crossover.

Allen: For those who want to see tech platforms held accountable for the content posted on their platforms, how do they propose that be done? I mean, the whole point of social media is that it’s an open marketplace of ideas and information. So how can that open space where free speech is allowed coexist with this element of moderation? Can the two be held together without free speech being violated?

Smith: Well, it’s a fascinating question, and I think it really cuts to the core of the issue in a lot of cases. But look, we’ve seen at least two states, Texas and Florida, try to make sure that tech companies are not deplatforming individuals due to their ideological beliefs. Both states passed bills that would essentially limit the ability of social media companies to deplatform certain individuals or restrict their access based solely on the content they’re posting on those websites.

Now, the 5th Circuit Court of Appeals upheld Texas’ law restricting the ability of social media companies to remove or deplatform certain individuals, while the 11th Circuit Court of Appeals, another intermediate appellate court, struck down most of Florida’s law, which does essentially the same thing but with some key differences. The Supreme Court is currently being asked to review both of those cases as well, and it has called for the views of the Biden administration in both.

But many people are saying there could be a potential conflict here, depending on how the court rules in these Google and Twitter cases. If the court scales back the scope of Section 230 and the immunity it provides while at the same time upholding Florida’s or Texas’ law, that could put tech companies in a very difficult situation: prohibited from restricting certain information posted on their platforms even as their immunity is scaled back.

So I think there’s a lot left to play out in this space. It’ll certainly be a very interesting set of opinions to watch come out when the court issues them. And I think it’ll also be very important to watch and see whether or not the court decides to take up and review the cases involving Florida and Texas’ social media laws.

Allen: And with all of these cases, is there a chance that the user experience could change in any significant way moving forward? For our average listener, let’s say there are major changes to Section 230. Depending on the rulings, could we start seeing a lot more content moderation, where someone posting an opinion on their Facebook page sees it taken down? We’ve already seen a lot of that; could it just increase?

Smith: Well, it really depends. I think we just have to wait and see how this plays out, and what happens with the Texas and Florida laws as well, because again, the gist of those laws is to prevent tech companies from removing certain individuals. And there’s always the possibility that Congress could get involved and pass legislation in this area. So, again, I think there’s a lot of uncertainty right now.

There’s also the backdrop of a push by some members of the court to reevaluate the standard that courts apply when considering defamation claims and how that standard ties into the court’s First Amendment case law. So I think we could be in for a few years of uncertainty in this very important area as these issues continue to percolate their way through the courts and as tech companies and individual users figure out how to respond to whatever changes come.

Allen: Well, Zack, Tuesday was just the beginning of a two-week argument session for the justices. Are there any other cases that you’re following closely?

Smith: There are. There are a couple of big ones that the court still has to hear and decide. The two biggest involve the Biden administration’s attempt to forgive student loan debt; the court is going to hear challenges to that action in the next few weeks.

There’s also an important religious liberty case, Groff v. DeJoy, about whether employers have to provide certain accommodations to religious employees. And there are a number of other interesting cases involving sovereign immunity and takings issues. So even though we’re in the back half of the court’s term, there’s still a lot left for the court to do before the term ends later this year.

Allen: Lots of exciting cases ahead. Zack, we really appreciate your time today and your willingness to join us to break down these somewhat wonky Section 230 cases.

Smith: Of course. Happy to do it.

Have an opinion about this article? To sound off, please email letters@DailySignal.com and we’ll consider publishing your edited remarks in our regular “We Hear You” feature. Remember to include the URL or headline of the article plus your name and town and/or state.

