Silicon Valley is battling states over new online safety laws for children

Last summer, Ohio passed a social media statute that requires Instagram, Snapchat, TikTok and YouTube to obtain parental consent before allowing children under 16 to use their platforms.

But this month, just before the measure was set to take effect, a tech industry group called NetChoice — which represents Google, Meta, Snap, TikTok and others — filed a lawsuit to block it on freedom of speech grounds, and convinced a Federal District Court judge to temporarily halt the new rules.

The case is part of a wide-ranging litigation campaign by NetChoice to block new state laws protecting young people online — an anti-regulation effort likely to come under scrutiny Wednesday as the Senate Judiciary Committee questions social media executives about child sexual exploitation online. The NetChoice lawsuits have roiled state officials and lawmakers who sought input from tech companies in crafting the new measures.

“I think it's cowardly and disingenuous,” Jon Husted, Ohio's lieutenant governor, said of the industry lawsuit, noting that he or his staff had met with Google and Meta last year about the bill and that the companies' concerns had been addressed. “We tried to be as cooperative as possible — and then at the eleventh hour they filed a lawsuit.”

Social media platforms said some state laws contradict each other and that they would prefer Congress to pass a federal law setting national standards for children's online safety.

NetChoice said the new state laws infringe on its members' First Amendment rights to freely disseminate information, as well as the rights of minors to obtain information.

“There's a reason why this is such a big win for NetChoice every time,” said Carl Szabo, the group's vice president. “And that's because it's so clearly unconstitutional.”

Fueled by escalating public concerns about young people's mental health, lawmakers and regulators in the United States are stepping up bipartisan efforts to rein in popular social media platforms by enacting a wave of legislation, even as tech industry groups work to undo those measures.

A first-of-its-kind law passed in Utah last spring would require social media companies to verify users' ages and obtain parental consent before allowing minors to create accounts. Arkansas, Ohio, Louisiana and Texas subsequently passed similar laws requiring parental consent for social media services.

A groundbreaking new California law, the Age-Appropriate Design Code Act, would require many popular social media apps and multiplayer video games to enable the highest privacy settings by default for minors — and to disable potentially risky features, like messaging systems that allow adult strangers to contact young people.

“The intent is to ensure that all tech products that anyone under the age of 18 has access to are safe for children by design and by default,” said Buffy Wicks, a member of the California Assembly and a co-sponsor of the bill.

But free speech lawsuits by NetChoice have dealt a major blow to these state efforts.

In California and Arkansas, judges in the NetChoice cases last year temporarily blocked the new state laws from taking effect. (The New York Times and the Student Press Law Center filed a joint friend-of-the-court brief last year in the California case in support of NetChoice, arguing that the law could limit the newsworthy content available to students.)

“There has been a lot of pressure on states to regulate social media and protect against its harms, and much of that fear is now being channeled into laws that specifically affect children,” said Genevieve Lakier, a professor at the University of Chicago Law School. “What you see here is that the First Amendment still matters — these laws have been blocked in many cases.”

State lawmakers and officials said they viewed the tech industry's legal challenges as a temporary setback, describing their new laws as reasonable measures to ensure basic safety for children online. Rob Bonta, California's attorney general, said the state's new law would regulate the design of platforms and the behavior of companies — not their content. The California statute, which takes effect in July, does not explicitly require social media companies to verify the age of each user.

Mr. Bonta recently appealed the ruling that halted the law.

“NetChoice has a burn-it-all-down strategy, and they're going to challenge every law and regulation that protects children and their privacy in the name of the First Amendment,” he said in a telephone interview on Sunday.

On Monday, California lawmakers introduced two children's online privacy and safety bills that Mr. Bonta sponsored.

NetChoice has also filed a lawsuit to try to block Utah's new social media law, which requires Instagram and TikTok to verify users' ages and obtain parental consent for minors to have accounts.

Civil rights groups have warned that such legislative efforts could undermine freedom of expression — by requiring adults, as well as minors, to verify their age using documents such as driver's licenses just to set up and use social media accounts. Requiring parental consent for social media, they say, could also prevent young people from finding support groups or important resources about reproductive health or gender identity.

The Supreme Court has struck down a number of laws aimed at protecting minors from potentially harmful content, including violent video games and “indecent” online material, on freedom of expression grounds.

Social media companies said they had put in place many protections for young people and that they would prefer to see Congress pass federal legislation rather than require companies to comply with a patchwork of sometimes conflicting state laws.

Snap recently became the first social media company to support a federal bill called the Kids Online Safety Act, which has some similarities to California's new law.

In a statement, Snap said many of the provisions in the federal bill mirrored the company's existing safeguards, such as setting teen accounts to the strictest privacy settings by default. The statement added that the bill would direct government agencies to study technological approaches to age verification.

Google and TikTok declined to comment.

Meta has called on Congress to pass legislation that would make Apple's and Google's app stores — and not social media companies — responsible for verifying a user's age and obtaining parental consent before allowing anyone under 16 to download an app. Meta recently started running ads on Instagram saying it supported such federal legislation.

“We are in favor of clear, consistent legislation that makes it simpler for parents to help manage their teens' online experiences, and that means all apps teens use meet the same standard,” Meta said in a statement. “We want to continue working with policymakers to find more workable solutions.”

But merely requiring parental consent would do nothing to mitigate the potentially harmful effects of social media platforms, the federal judge in Ohio's NetChoice case has noted.

“Foreclosing minors under 16 from accessing all content” on social media websites “is a breathtakingly blunt instrument for reducing social media's harm to children,” Judge Algenon L. Marbley, chief judge of the U.S. District Court for the Southern District of Ohio, Eastern Division, wrote in his ruling temporarily halting the state's social media law.
