The future of the internet is on the line this week as the Supreme Court hears major cases surrounding content moderation on platforms like Facebook, YouTube and X, formerly Twitter. These cases, along with others being heard by the high court, could fundamentally alter how information is shared online, said GBH social media strategist Zack Waldman.

In one set of cases, known as Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, the Supreme Court heard arguments around the constitutionality of state laws in Florida and Texas that would limit the ability of social media companies to moderate content on their platforms.

“Republican lawmakers in both states are contending that the social media companies have been too quick to throttle down conservative viewpoints,” Waldman said. “On the other end of the spectrum, trade groups representing the likes of Facebook, YouTube and X counter that those laws should be shut down because they infringe on the companies’ First Amendment rights to free speech.”

The impact on social media users could be significant, Waldman said.

“It's going to significantly impact how you, me and millions of Americans are getting their news and information and how much spam and hate speech and misinformation they're going to be seeing in their social media feed of choice,” Waldman said.

There’s some precedent for the court, Waldman said, including a 1974 case in which the Supreme Court ruled that Florida government officials could not force the Miami Herald to publish rebuttals to editorials.

But social media platforms are not exactly like newspapers, Waldman said. And the court will have to decide which framework, designed for older technologies, should apply to them.

“The big question here is: Are social media companies more like newspapers, or are they more like telephone operators, in which they kind of are transmitting content generated by customers and not actually creating it?” Waldman said.

“One Florida lawyer in the case said the telephone company, internet service provider, delivery company — they can all be prevented from squelching or discriminating against the speech they carry, and so can the platforms,” he said. “I have a little bit of issue with this in terms of making that analogy, because, you know, two people having a one-on-one phone call conversation seems to me like a little bit different than somebody posting on X or Facebook or YouTube and disseminating information or messaging to thousands, maybe even millions of people that are viewing that post.”

There are also competing political forces at play, Waldman noted: As these cases, in which state governments are pushing platforms to do less content moderation, work their way through the courts, social media platforms are also facing calls for more moderation of the content that teenagers and other young users are exposed to.

“Very recently, leaders from several of the social media companies that are kind of in question on these Supreme Court cases just went in front of a Senate Judiciary Committee and were grilled by lawmakers about what more they could be doing to better protect teens from the harms of their apps,” Waldman said. “And now, separately, in the Supreme Court, you have two state laws that are being reviewed by the high court that could diminish the power of those platforms to moderate content. To me, it's like, what is this? If you're Congress or government officials on the state and federal level, do you want more moderation or less moderation from these platforms?”

The Supreme Court is expected to release its decision in the cases by the end of June.