Comment: Courts can set Big Tech straight with kid safeguards

Regulations should protect everyone, but kids’ well-being is a good place to start.

By Roger McNamee / Los Angeles Times

After more than a decade of uncontrolled experiments by internet platforms on millions of users, there is an emerging possibility that one group of users — kids — may gain some protection. A wave of court cases has an opportunity to fill a void left by the inaction of the executive and legislative branches of the federal government.

In the eight years since Russia used Facebook, Instagram and other platforms to interfere in the U.S. presidential election, Congress has done nothing to protect our democracy from assault by bad actors. It has stood by while platforms do anything that earns them a buck. It has also done nothing to protect Americans from the manipulative practices of surveillance capitalism. The White House has done only slightly more than nothing. Courts continue to side with internet platforms over the people who use them.

It should be no surprise that federal politicians favor Big Tech. Silicon Valley is where the money is. Just as important, voters have not penalized politicians for failing in their duty to protect the public interest. There has been no outcry about politicians whose family members work in Big Tech and staff members whose salaries are paid by owners of Big Tech. Politicians at the state level have passed some tech reform legislation, with California leading the way, but industry lobbying has taken the teeth out of most of the laws.

In court, internet platforms have avoided unfavorable judgments by asserting rights to free speech, as well as the protection of Section 230 of the Communications Decency Act of 1996. While there have historically been limits on First Amendment protection for harmful speech, courts have not applied any limit to the speech of internet platforms. Section 230, which was created to enable internet platforms to moderate harmful speech online, has been interpreted by courts as blanket immunity, even in cases of negligence.

Internet platforms should not be allowed to harm children (and adults) with impunity. They should not be allowed to undermine democracy and public health for profit. These notions seem obvious to everyone but those in a position to rectify the situation.

The Wall Street Journal published a report last summer titled, “Instagram Connects Vast Pedophile Network: The Meta unit’s systems for fostering communities have guided users to child-sex content.” Unredacted testimony from a federal court in California revealed that Meta employees warned Mark Zuckerberg that the design of Instagram led to addiction for many teens, only to have Zuckerberg ignore the warnings.

The common element to both stories is the indifference of Meta management to harm. The underlying cause of that indifference is the absence of consumer safety regulations for tech. Consumer safety creates friction that limits growth and profitability, something platforms avoid at all costs. Eight years of trusting platforms to self-regulate has not prevented them from being used to instigate acts of terrorism, unleash a tsunami of public health disinformation in a pandemic or enable an insurrection at the U.S. Capitol.

Fortunately, a new wave of legal cases will give courts an opportunity to change course.

The cases aim to protect children online by challenging the design of internet platforms. Thirty-three state attorneys general — led by California and Colorado — have filed a case in federal court against Meta for designing products to addict children. Nine other state attorneys general filed similar cases in their own state courts.

By focusing on product design, the cases minimize conflict with the First Amendment and Section 230. Free speech and the right to moderate speech are protected by the law, whereas product design that leads to harm and the refusal to remediate it should not be. With cases in 10 jurisdictions, the odds of a favorable outcome for the plaintiffs are better than they would be in a single jurisdiction.

In addition, there will be an appeal in federal court related to California’s Age Appropriate Design Code, a law that requires platforms to protect the privacy of minors in an age-appropriate way. Modeled on a successful consumer protection law in Britain, the California measure passed the Legislature unanimously and was signed into law in September 2022. NetChoice, a trade organization funded by Google, Meta, TikTok, Amazon and others, quickly sued to block the law.

A federal district court judge in September granted a preliminary injunction on the basis that the law probably violates the First Amendment. The flaw in the court’s reasoning is that the law has nothing to do with content or expression. The decision suggests that corporations can use the First Amendment to defeat regulations designed to protect the public interest.

California Attorney General Rob Bonta has filed an appeal to challenge the injunction, arguing that we “should be able to protect our children as they use the internet. Big businesses have no right to our children’s data: childhood experiences are not for sale.” Bonta should have extended this logic to cover all Californians, but the wisdom of it in the context of children is self-evident.

By coincidence, new whistleblower disclosures have exposed reckless business practices by Meta. In testimony before a Senate committee, whistleblower Arturo Béjar confirmed that Meta’s management was fully aware of the prevalence of misogyny and unwanted sexual advances toward teenagers on Instagram and refused to take action.

Béjar’s testimony builds on that of Frances Haugen, who in 2021 provided documentary evidence that Meta’s management knew Instagram was toxic for teenage girls. Yet even after that disclosure, Meta escaped liability. It remains to be seen whether Béjar’s testimony will produce any legislative action.

The best way to ensure protection for consumers online is for Congress to pass laws that protect Americans from harmful tech products and predatory data practices. But until that happens, the courts may be our children’s only line of defense.

Roger McNamee is a co-founder of Elevation Partners and the author of “Zucked: Waking Up to the Facebook Catastrophe.”

