Comment: Musk’s abrupt silence on AI concerns is deafening

Not long ago, AI was an existential threat in the tech mogul’s mind. Does political convenience now reign?

By Parmy Olson / Bloomberg Opinion

Elon Musk has painted himself as a humanitarian figure building a utopian future through a passel of companies. Don’t fall for it. The billionaire’s silence on the sudden reversal of U.S. government guidelines for building safer artificial intelligence shows his priorities are political capital and his own business interests.

Among the flurry of executive orders that President Donald Trump signed this week was a repeal of Joseph Biden’s order on AI. Issued in October 2023, it called on major AI companies to share safety test results with the government. It was a simple list of requests. Biden’s executive order couldn’t legally force tech firms to do anything, but it was the strongest signal so far that the U.S. government was serious about the safety and oversight of AI systems.

Trump did say on the campaign trail that he would revoke the order, following grumbling from members of the Republican Party that it stifled innovation. But Musk, now serving as an adviser with a White House role and direct access to Trump, has remained conspicuously silent on an action he might once have forcefully opposed.

In March 2023, he signed an open letter calling for a six-month pause on advanced AI, warning it posed “profound risks to society and humanity.” A few months later he told BBC News that AI could cause “civilization destruction,” comparing it with nuclear weapons. Musk ended his longtime friendship with Google co-founder Larry Page over an argument about AI risk, according to Walter Isaacson’s biography, and he co-founded OpenAI over concerns that Google wasn’t paying enough attention to the technology’s existential threat to humanity.

“I think we need to regulate AI safety,” Musk said in 2023. “It is, I think, actually a bigger risk to society than cars or planes or medicine.”

If Musk truly believed that, he’d be advising the new president to maintain the system already in place, which wasn’t that onerous to begin with. So far, the largest AI labs have voluntarily cooperated with AI safety institutes both in the U.S., based in the National Institute of Standards and Technology (NIST) in Maryland, and in the United Kingdom. Biden’s order hadn’t set hard standards so much as guidance for reporting and transparency on the part of tech firms.

That’s sorely needed at a time when, thanks to the opaque nature of the largest AI labs, we know more about the ingredients in a packet of Doritos than we do about a generative AI model that banks and legal firms are plugging into their systems.

Meanwhile, the stakes have only grown bigger, with OpenAI and partners including SoftBank Group Corp. and Oracle Corp. now planning a $500 billion infrastructure investment that would dramatically accelerate AI development: exactly the kind of rapid scaling that Musk once warned could be catastrophic. Yet on this, too, the former doomsayer remains quiet.

Such selective silence is hardly surprising from someone who launched Tesla Inc. to combat climate change but now aligns with anti-electric vehicle politicians, or who claims to champion free speech while kicking journalists off his platform and suing his critics.

Musk’s principles seem to be as erratic as his tweets and, right now, being Trump’s new best friend seems to outweigh being humanity’s self-appointed sentinel.

Musk’s warnings on AI weren’t necessarily right. There are more near-term concerns about the security and fairness of AI models, and their impact on the job market. But his current hush speaks volumes about how a billionaire’s apocalyptic concerns can be set aside for political convenience.

Perhaps we should expect to see less agitating from Musk on AI standards, and for him to spend more time and energy unblocking policies that could impede his companies from getting ahead in the AI race, including Space Exploration Technologies Corp., Tesla and X.AI Corp. If the man who once called AI humanity’s greatest existential threat won’t speak up to defend basic safety measures, it’s worth asking what other principles of his might crumble in the face of power and access.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “Supremacy: AI, ChatGPT and the Race That Will Change the World.”
