By The Herald Editorial Board
Many parents tonight will exercise their “parental controls” over the candy and treats that their kids will bring home after a night of Trick-or-Treating. In most homes, after an inspection — and the grown-ups’ candy tax — the “loot” will go in a cupboard or drawer to be doled out as parents see fit. At least that’s the intention.
But parents may be less familiar with how to exercise parental control over Facebook, Instagram, TikTok and other social media platforms, and are largely unaware of the ways those companies can thwart their efforts to keep an eye on their kids’ social media use. Facebook, for example, doesn’t make it simple to set limits on privacy and what kids can see. There’s no clear settings option on Facebook’s main page; you first have to open the menu, scroll down to “Settings & privacy,” then look for “Privacy checkup” under the Settings menu, where the selections cover various features, none of them aimed specifically at helping parents set controls. Nor is there anything that locks out a tech-savvy child from resetting what a parent might have changed.
It’s no surprise, then, that a bipartisan brigade of attorneys general from some 40 states, including Washington’s Bob Ferguson, has filed suits against Meta — the parent of Facebook and Instagram — alleging the tech behemoth has knowingly included features that are harmful and addictive to children, seeking to keep kids scrolling and advertising revenue flowing.
Warnings about social media’s potential for harm and addictive use among children have grown in recent years.
In May, U.S. Surgeon General Vivek Murthy issued a 19-page advisory warning that the effects of social media on adolescent mental health are not fully understood, and that this gap in understanding comes at a time of growing concern for the mental health, physical well-being and social development of kids and teens.
The report’s concern was twofold: the amount of time children spend using social media, and the messages they view and interact with. The lawsuits call out both.
And in 2021, a former Facebook employee, Frances Haugen, leaked internal Meta research showing the company knew that Instagram and Facebook posed mental health risks to youths.
“The choices being made inside of Facebook are disastrous for our children, for our public safety, for privacy and for our democracy. And that is why we must demand Facebook changes,” Haugen said during testimony before Congress.
The 233-page legal complaint argues that Meta’s conduct “constitutes unfair or deceptive acts or practices” that violate state and federal laws regarding consumer rights and privacy, as well as specific laws regarding children and social media.
Along with financial penalties, the suit also seeks to block Meta from using features that can harm children and youths, such as “infinite scrolling,” “likes” and notifications delivered with sound and vibration, features that studies have shown condition children and adults to continued use and that early developers have compared to “behavioral cocaine.”
Those features have proved to work well to keep kids — and adults — engaged, obsessively so.
The presence of social media in the lives of U.S. teens is nearly universal. About 95 percent say they use social media platforms, most frequently TikTok, Instagram and Snapchat; two-thirds use them for an average of three hours each day; and 1 in 5 report using them “almost constantly,” according to the Pew Research Center. That level of use contributes to a lack of sufficient sleep and problems with attention to studies and tasks.
The Pew research found that children and teens also reported exposure to “extreme, inappropriate content” and predatory contacts; dissatisfaction with their own bodies, especially among girls; and an addiction-like pull toward frequent use.
The remedies sought in the suit, as cited by Ferguson’s office, include: altering or eliminating the “like” button; restricting the frequency of notifications and how they’re delivered; eliminating the “infinite scroll” feature; and strengthening age limits, including a separate type of account for users under a certain age that would block the features most harmful to kids.
If the “like” button seems innocuous, consider that it offers the reward of others’ validation, which keeps users checking past posts and encourages new ones; the platforms’ algorithms then use those “likes” to steer users toward advertising and other content.
Meta said it was “disappointed” that the states turned to a lawsuit rather than working with it on age-appropriate standards. But Meta has done more than avoid making changes that would benefit children; it and its industry allies have gone to court when states have attempted to adopt laws setting standards to protect kids. An online child safety bill passed by California last year was challenged in court, and last month a federal judge preliminarily blocked the law over potential First Amendment violations.
Though they are not named in the lawsuit, much of the concern here also applies to the wider industry, including TikTok and X, formerly known as Twitter. Notably, in China, TikTok’s domestic counterpart, Douyin, limits users younger than 14 to 40 minutes of screen time a day.
The intention in the lawsuit isn’t to wall kids off from social media, but to set limits that allow them to use it safely and to their benefit.
The surgeon general’s report this spring also cited social media’s ability to provide positive community and connection for youths with peers who share their interests, abilities and identities, especially for those who are often marginalized, including racial, ethnic, sexual and gender minorities.
Whether by ruling or settlement, policies must be put in place that allow users, especially parents, to set reasonable protections and limits for themselves and their children.
A treat without the tricks.