Comment: Musk’s abrupt silence on AI concerns is deafening

Not long ago, AI was an existential threat in the tech mogul’s mind. Does political convenience now reign?

By Parmy Olson / Bloomberg Opinion

Elon Musk has painted himself as a humanitarian figure building a utopian future through a passel of companies. Don’t fall for it. The billionaire’s silence on the sudden reversal of U.S. government guidelines for building safer artificial intelligence shows his priorities are political capital and his own business interests.

Among the cornucopia of executive orders that President Donald Trump signed this week was a repeal of Joseph Biden’s order on AI. Issued in October 2023, it called on major AI companies to share safety test results with the government. It was a simple list of requests: the order couldn’t legally force tech firms to do anything, but it was the strongest signal so far that the U.S. government was serious about the safety and oversight of AI systems.

Trump did say on the campaign trail that he would revoke the order, following grumbling from members of the Republican Party that it stifled innovation. But Musk, now serving as an adviser with a White House role and direct access to Trump, has remained conspicuously silent on an action he might once have forcefully opposed.

In March 2023, he signed an open letter calling for a six-month pause on advanced AI, warning it posed “profound risks to society and humanity.” A few months later he told BBC News that AI could cause “civilization destruction,” comparing it with nuclear weapons. Musk ended his longtime friendship with Google co-founder Larry Page over an argument about AI risk, according to Walter Isaacson’s biography, and he co-founded OpenAI over concerns that Google wasn’t paying enough attention to the technology’s existential threat to humanity.

“I think we need to regulate AI safety,” Musk said in 2023. “It is, I think, actually a bigger risk to society than cars or planes or medicine.”

If Musk truly believed that, he’d be advising the new president to maintain the system already in place, which wasn’t that onerous to begin with. So far, the largest AI labs have voluntarily cooperated with AI safety institutes in both the U.S., where the institute is housed at the National Institute of Standards and Technology (NIST) in Maryland, and the United Kingdom. Biden’s order hadn’t set hard standards so much as guidance for reporting and transparency on the part of tech firms.

That’s sorely needed at a time when, thanks to the opaque nature of the largest AI labs, we know more about the ingredients in a packet of Doritos than we do about a generative AI model that banks and legal firms are plugging into their systems.

The stakes of AI development have only grown bigger, with OpenAI and partners including SoftBank Group Corp. and Oracle Corp. now planning a $500 billion infrastructure investment that would dramatically accelerate AI development: exactly the kind of rapid scaling that Musk once warned could be catastrophic. Yet on this, too, the former doomsayer remains quiet.

Such selective silence is hardly surprising from someone who launched Tesla Inc. to combat climate change but now aligns with anti-electric vehicle politicians, or who claims to champion free speech while kicking journalists off his platform and suing his critics.

Musk’s principles seem as erratic as his tweets, and right now being Trump’s new best friend appears to outweigh being humanity’s self-appointed sentinel.

Musk’s warnings on AI weren’t necessarily right. There are more pressing near-term concerns: the security and fairness of AI models, and their impact on the job market. But his current hush speaks volumes about how a billionaire’s apocalyptic concerns can be set aside for political convenience.

Perhaps we should expect less agitating from Musk on AI standards, and more time and energy spent clearing away policies that could impede his companies, including Space Exploration Technologies Corp., Tesla and X.AI Corp., from getting ahead in the AI race. If the man who once called AI humanity’s greatest existential threat won’t speak up to defend basic safety measures, it’s worth asking what other principles of his might crumble in the face of power and access.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “Supremacy: AI, ChatGPT and the Race That Will Change the World.”
