If your child is addicted to TikTok, this could be the solution

In recent years, hundreds of families and school districts across the country have sued major tech companies, arguing that the addictive design of social media platforms popular with children has made too many of them unwell. Citing the promotion of the “corpse bride” diet and other dangerous weight-loss practices, Seattle Public Schools filed a complaint in January arguing that platforms like TikTok and Snapchat have been so relentless in delivering harmful material to students that the resulting anxiety, depression and suicidal ideation have interfered with the school system’s primary mission of educating children.

More recently, in October, New York Attorney General Letitia James, along with top prosecutors from more than 30 other states, filed suit against Meta, alleging that the company built features into Instagram and Facebook intended to addict children for profit.

Technology companies, claiming First Amendment protection, have tried to get these suits quickly dismissed. But on Tuesday, a federal judge in California issued a ruling that makes that more difficult. She argued that what most concerned the plaintiffs (ineffective parental controls, the difficulty of deleting accounts, the lack of age verification, and the timing and clustering of notifications to drive habitual use) was not the equivalent of speech, so the cases under her review must be allowed to proceed.

Forty years ago, drunk driving was an epidemic claiming the lives of young people, a seemingly intractable problem until a group of mothers campaigned for laws that brought accountability. It was a pivotal moment in the modern history of public health, and 2023 will likely be remembered, similarly, as a turning point in the social media health crisis.

In May, Surgeon General Vivek Murthy issued an advisory – a “call for urgent action” – to develop policies addressing social media use that erodes adolescents’ social lives and self-esteem and compromises their sleep and body image. Both state and federal legislatures have since attempted to enact laws that would keep certain types of emotionally disruptive content out of view.

If nothing else, these efforts have emerged as a zone of détente in our otherwise perpetual culture wars; TikTok seems to fuel adult ire wherever you stand on gender-neutral bathrooms or bans on “Antiracist Baby.” A Senate bill introduced in the spring – the Protecting Kids on Social Media Act – which would require companies to verify the age of their users, was sponsored by unlikely allies, including Connecticut Democrat Chris Murphy and Arkansas Republican Tom Cotton.

The problem with some of the proposed legislation is its emphasis on prohibition, which leaves interpretations of harm to the discretion of judges and regulators and in turn opens the door to endless litigation. Montana provides the clearest case. In May, the governor signed a law banning TikTok outright, with corporate fines to be imposed if the app is found operating in the state. Both the platform’s Chinese parent company, ByteDance, and TikTok users themselves immediately sued, claiming the law was unconstitutional.

New York has chosen a different path. State lawmakers, hoping to circumvent some of these obstacles and to serve as a model for the rest of the country, have committed to emphasizing distribution over content, and technical operation over questions of expression. Sponsored by Andrew Gounardes, a state senator from Brooklyn, two bills aim to make two changes. First, they would require social media companies to limit the use of the predictive algorithmic features designed to keep users on a given platform longer; second, they would allow parents to block access to social media sites between midnight and 6 a.m.

The legislation has the vocal support of Governor Kathy Hochul and Ms. James. “We want all these social media apps to show kids only the content they want to see,” Mr. Gounardes told me. “If a parent decides otherwise, they can turn the algorithm on. But the default is that it must be disabled.”

Zephyr Teachout, the lawyer who helped draft the legislation, saw a precedent in the way gambling is regulated. The algorithmic targeting is similar to the kind deployed by slot machines, which deliver, again and again, the enticing array of oranges and cherries that keeps you pulling the lever, the elusive jackpot in mind. Online gambling, as Ms. Teachout noted, essentially involves “the algorithmically determined type of content to be delivered,” and most states prohibit gambling for people under 18.

If the law were to come under Supreme Court scrutiny, a 2011 case striking down a California law that banned the sale or rental of violent video games to minors would likely emerge as a point of reference. In that case, even justices who concurred with the majority opinion pointed out that technology was changing rapidly and that different circumstances might later require a more nuanced approach. “They have put up a marker that is very relevant to this moment,” Ms. Teachout said. “They said the court should not simply apply old standards to new and rapidly evolving forms of digital media.”

The New York law, in the view of its creators, is so narrowly constructed that courts should recognize it as a critical response to a pervasive problem in which we all bear a special responsibility. Facebook would work the way it did in its early days, when your feed contained only what you had signed up for. No one would be prevented from seeking out whatever he or she wanted. “It’s just that you can’t open a Taylor Swift page and, five clicks later, see a video showing you how to hurt yourself,” Mr. Gounardes said.
