What you need to know about the Supreme Court’s arguments on social media laws

Social media companies are preparing Monday for Supreme Court arguments that could fundamentally change the way they police their sites.

After Facebook, Twitter and YouTube banned President Donald J. Trump in the wake of the Jan. 6, 2021, Capitol riots, Florida made it illegal for tech companies to ban a candidate for office in the state from their sites. Texas later passed its own law banning platforms from removing political content.

Two technology industry groups, NetChoice and the Computer & Communications Industry Association, have filed a lawsuit to block the laws from taking effect. They argued that the companies have the right under the First Amendment to make decisions about their own platforms, just as a newspaper gets to decide what’s on its pages.

The Supreme Court’s decision in those cases — Moody v. NetChoice and NetChoice v. Paxton — is a major test of the power of social media companies, potentially reshaping millions of social media feeds by giving the government influence over how platforms moderate content and what stays online.

“What’s at stake is whether they can be forced to transmit content they don’t want,” said Daphne Keller, a lecturer at Stanford Law School who filed a brief with the Supreme Court supporting the technology groups’ challenge to the Texas and Florida laws. “And perhaps even more importantly: whether the government can force them to distribute content they do not want.”

If the Supreme Court rules that the Texas and Florida laws are constitutional and lets them take effect, some legal experts speculate that the companies could create versions of their feeds specifically for those states. But such a ruling could also usher in similar laws in other states, and it is technically complex to restrict access to a website precisely by location.

Critics of the laws say the feeds to the two states could contain extremist content — from neo-Nazis, for example — that the platforms would previously have removed for violating their standards. Or, the critics say, the platforms could ban discussion of anything remotely political by blocking posts on many controversial issues.

The Texas law prohibits social media platforms from removing content based on the “viewpoint” of the user or expressed in the post. The law gives individuals and the state’s attorney general the right to file lawsuits against the platforms for violations.

The Florida law imposes fines on platforms if they permanently ban a candidate for state office from their sites. It also bars the platforms from removing content from a “journalistic enterprise” and requires the companies to be transparent about their content moderation rules.

Supporters of the Texas and Florida laws, passed in 2021, say they will protect conservatives from the liberal biases they say pervade California-based platforms.

“People around the world use Facebook, YouTube and …,” the states said in one legal brief. “And like the wire companies of old, today’s social media giants use their control over the mechanisms of this ‘modern public square’ to direct—and often suppress—public debate.”

Chase Sizemore, a spokesman for Florida’s attorney general, said the state “looks forward to defending our social media law that protects Floridians.” A spokeswoman for the Texas attorney general had no comment.

The platforms now decide what will and will not remain online.

Companies like Meta’s Facebook and Instagram, TikTok, Snap, YouTube and X have long policed themselves, setting their own rules for what users can say while the government takes a hands-off approach.

In 1997, the Supreme Court ruled that a law regulating indecent speech on the Internet was unconstitutional, distinguishing the Internet from media where the government regulates content. For example, the government enforces decency standards on television and radio broadcasts.

For years, bad actors have flooded social media with misleading information, hate speech and harassment, prompting companies to come up with new rules over the past decade, including banning false information about elections and the pandemic. Platforms have banned figures like influencer Andrew Tate for violating their rules, including against hate speech.

But there has been a right-wing backlash to these measures, with some conservatives accusing the platforms of censoring their views – even prompting Elon Musk to say he wanted to buy Twitter in 2022 to help ensure users’ freedom of expression.

Thanks to a law known as Section 230 of the Communications Decency Act, social media platforms are not held liable for most content posted on their sites. So they face little legal pressure to remove problematic posts and users who violate their rules.

The tech groups say the First Amendment gives companies the right to remove content as they see fit because it protects their ability to make editorial choices about the content of their products.

In their lawsuit challenging the Texas law, the groups said that, similar to a magazine’s publishing decision, “a platform’s decision about what content to host and what to exclude is intended to convey a message about the type of community the platform hopes to foster.”

Still, some legal scholars worry about the consequences of allowing social media companies unlimited power under the First Amendment, which is intended to protect both freedom of speech and the press.

“I worry about a world in which these companies are invoking the First Amendment to protect what many of us believe are commercial activities and conduct that are not expressive,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to Federal Trade Commission Chair Lina Khan.

The court will hear the arguments of both parties on Monday. A decision is expected in June.

Legal experts say the court could find the laws unconstitutional but still provide a road map for fixing them. Or it could uphold the companies’ First Amendment rights in full.

Carl Szabo, the general counsel of NetChoice, which represents companies like Google and Meta and lobbies against tech regulations, said that if the group’s challenge to the laws fails, “Americans across the country should see lawful but terrible content” that is interpreted as political and is therefore covered by the laws.

“There are a lot of things that can be framed as political content,” he said. “Terrorist recruitment may be political content.”

But if the Supreme Court rules that the laws violate the Constitution, it will entrench the status quo: Platforms, not anyone else, will determine which speech can remain online.

Adam Liptak contributed reporting.
