Comment: Courts can set Big Tech straight with kid safeguards

Regulations should protect everyone, but kids’ well-being is a good place to start.

By Roger McNamee / Los Angeles Times

After more than a decade of uncontrolled experiments by internet platforms on millions of users, there is an emerging possibility that one group of users — kids — may gain some protection. A wave of court cases has an opportunity to fill a void left by the inaction of the executive and legislative branches of the federal government.

In the eight years since Russia used Facebook, Instagram and other platforms to interfere in the U.S. presidential election, Congress has done nothing to protect our democracy from assault by bad actors. It has stood by while platforms do anything that earns them a buck. It has also done nothing to protect Americans from the manipulative practices of surveillance capitalism. The White House has done only slightly more than nothing. Courts continue to side with internet platforms over the people who use them.

It should be no surprise that federal politicians favor Big Tech. Silicon Valley is where the money is. Just as important, voters have not penalized politicians for failing in their duty to protect the public interest. There has been no outcry about politicians whose family members work in Big Tech and staff members whose salaries are paid by owners of Big Tech. Politicians at the state level have passed some tech reform legislation, with California leading the way, but industry lobbying has taken the teeth out of most of the laws.

In court, internet platforms have avoided unfavorable judgments by asserting rights to free speech, as well as the protection of Section 230 of the Communications Decency Act of 1996. While there have historically been limits on First Amendment protection for harmful speech, courts have not applied any limit to the speech of internet platforms. Section 230, which was created to enable internet platforms to moderate harmful speech online, has been interpreted by courts as blanket immunity, even in cases of negligence.

Internet platforms should not be allowed to harm children (and adults) with impunity. They should not be allowed to undermine democracy and public health for profit. These notions seem obvious to everyone but those in a position to rectify the situation.

The Wall Street Journal published a report last summer titled, “Instagram Connects Vast Pedophile Network: The Meta unit’s systems for fostering communities have guided users to child-sex content.” Unredacted testimony from a federal court in California revealed that Meta employees warned Mark Zuckerberg that the design of Instagram led to addiction for many teens, only to have Zuckerberg ignore the warnings.

The common element to both stories is the indifference of Meta management to harm. The underlying cause of that indifference is the absence of consumer safety regulations for tech. Consumer safety creates friction that limits growth and profitability, something platforms avoid at all costs. Eight years of trusting platforms to self-regulate has not prevented them from being used to instigate acts of terrorism, unleash a tsunami of public health disinformation in a pandemic or enable an insurrection at the U.S. Capitol.

Fortunately, a new wave of legal cases will give courts an opportunity to change course.

The cases aim to protect children online by challenging the design of internet platforms. Thirty-three state attorneys general — led by California and Colorado — have filed a case in federal court against Meta for designing products to addict children. Nine other state attorneys general filed similar cases in their own state courts.

By focusing on product design, the cases minimize conflict with the First Amendment and Section 230. Free speech and the right to moderate speech are protected by the law, whereas product design that leads to harm and the refusal to remediate it should not be. With cases in 10 jurisdictions, the odds of a favorable outcome for the plaintiffs are better than they would be in a single jurisdiction.

In addition, there will be an appeal in federal court related to California’s Age Appropriate Design Code, a law that requires platforms to protect the privacy of minors in an age-appropriate way. Modeled on a successful consumer protection law in Britain, the California measure passed the Legislature unanimously and was signed into law in September 2022. NetChoice, a trade organization funded by Google, Meta, TikTok, Amazon and others, quickly sued to block the law.

A federal district court judge in September granted a preliminary injunction on the basis that the law probably violates the First Amendment. The flaw in the court’s reasoning is that the law has nothing to do with content or expression. The decision suggests that corporations can use the First Amendment to defeat regulations designed to protect the public interest.

California Attorney General Rob Bonta has filed an appeal to challenge the injunction, arguing that we “should be able to protect our children as they use the internet. Big businesses have no right to our children’s data: childhood experiences are not for sale.” Bonta should have extended this logic to cover all Californians, but the wisdom of it in the context of children is self-evident.

By coincidence, new whistleblower disclosures have exposed reckless business practices by Meta. In testimony before a Senate committee, whistleblower Arturo Béjar confirmed that Meta’s management was fully aware of the prevalence of misogyny and unwanted sexual advances toward teenagers on Instagram and refused to take action.

Béjar’s testimony builds on that of Frances Haugen, who in 2021 provided documentary evidence that Meta’s management knew Instagram was toxic for teenage girls. Yet even after that disclosure, Meta escaped liability. It remains to be seen whether Béjar’s testimony will produce any legislative action.

The best way to ensure protection for consumers online is for Congress to pass laws that protect Americans from harmful tech products and predatory data practices. But until that happens, the courts may be our children’s only line of defense.

Roger McNamee is a co-founder of Elevation Partners and the author of “Zucked: Waking Up to the Facebook Catastrophe.”
