Editorial: Keep a mindful eye on government use of AI chatbots

A public media report on government use of chatbots, including by Everett, calls for sound guidelines.

By The Herald Editorial Board

It’s not just the kids using artificial intelligence chatbots to finish that essay on “The Scarlet Letter” for a class assignment due the next morning.

Turns out a fair number of local government staffers and officials are using generative AI programs, such as ChatGPT and Microsoft’s Copilot, for a range of tasks, including generating — in whole or in part — social media posts, press releases, policy documents, speeches, talking points, grant and other funding applications and replies to constituent emails.

For a two-part investigation published this week, public radio station KNKX (94.5 FM) and Cascade PBS (Channel 9) made public records requests of city officials throughout the state, which returned thousands of pages of chat logs, including prompts and the AI responses. The reporting includes examples of those records as well as interviews with officials from the cities of Everett and Bellingham. The reporting makes clear that neither city is an outlier in its use of generative AI, but that both were simply the most responsive to the records requests; other local governments, including Seattle, are still in the process of responding.

Increasingly, generative AI is seen as a money-saving tool, especially for state and local governments looking to meet growing demands on staff time.

A recent survey of information technology (IT) decision makers by EY, the professional services company formerly known as Ernst & Young, found that about 45 percent of state and local governments report using AI, with 39 percent using generative AI chatbots as a routine part of their work.

Chatbots, drawing from vast amounts of online data, respond to prompts and simulate human conversation by predicting the most likely words to follow in a sentence. They’ve been used to generate — we won’t say “write” — prose and poetry, music, computer code, detailed reports and more than a few last-minute homework assignments.

Everett, the report noted, had used the technology for tasks ranging from the mundane, such as a lighthearted Instagram post marking the first day of spring, to more complex work, such as gathering data and providing guidance on policy questions on topics including housing supply, gunshot detection tools and the city’s comprehensive plan update for 2044.

And that use is with the blessing of both mayors.

“I think that we all are going to have to learn to use AI,” Everett Mayor Cassie Franklin told reporter Nate Sanford. “It would be silly not to. It’s a tool that can really benefit us.”

Long facing a structural deficit and — like all local governments — limited to a 1 percent increase in property tax revenue, Everett has made numerous cuts to staff in recent years, making technology such as AI of value in continuing to meet demands for city services.

“If we don’t embrace it and use it, we will really be left behind,” Franklin said.

As valuable as that tool could be in such situations, local governments will need to take care with a technology that — in the case of OpenAI’s ChatGPT — has been in development for only three years but already is on its 19th iteration, with two active versions.

And at least some city staffers, the report notes, understand that generative AI shouldn’t be left to its own devices. The technology is prone to “hallucinating,” fabricating data, figures and sources that don’t exist.

In May, the Washington Post reported that a White House report issued by the Department of Health and Human Services on children’s health made multiple errors, citing the wrong author for some studies and citing reports that didn’t exist at all; footnotes showed indications the report had been created using generative AI.

Used in updating Everett’s comprehensive plan, ChatGPT made repeated mistakes — kept out of the final draft — in analyzing the percentages of city residents spending more than 30 percent of their income on housing, the report found, to the exasperation of a city employee: “I told you factual accuracy is paramount and not to make unsubstantiated assertions or remarks, and you just did it,” the staffer said in one response. “Please remember this key instruction. Do not hallucinate.”

The ‘bot’s response after making corrections: “Thank you for catching that. I’ll ensure these totals are used going forward.”

One imagines the above sentence in the dispassionate voice of the HAL 9000 computer from “2001: A Space Odyssey.”

Everett isn’t sending city employees blindly into this brave new world. In June 2024 it provided provisional guidelines on AI usage and is developing an update with stronger guidance, using a model developed by the GovAI Coalition, a group of cities collaborating on policy for the use of AI in government. One recent change: Everett staff are now asked to use a version of Microsoft’s Copilot chatbot developed with local government in mind.

The new policy also wrestles with when and how to notify the public that AI has been used in city communications and documents. The policy being drafted in Everett will require disclosure of its use for tasks involving “more than mere language refinement,” the report said.

As the use of this tool becomes more frequent, transparency regarding its use will be key to maintaining the public’s trust. Already, 65 percent of the public recently told KFF that they are not confident about the accuracy of health information provided by AI chatbots. That lack of confidence likely extends beyond health topics.

Other local governments should be following Everett’s lead in developing and regularly updating their guidance on the use of generative AI and chatbots, at least as often as technology companies are updating these tools.

Simone Tarver, Everett’s communications manager, told Cascade PBS and KNKX that select staff from city departments are meeting weekly for AI training and discussion.

AI chatbots are remarkable in their ability to mimic — and mimic is what they do — analysis and reasoning, even producing complete works in mere seconds. But the human process of thinking — the thought involved in gathering information, considering it and communicating it to others — can’t be replicated by microchips in a server farm.

While the time-saving capability of AI can be seen as valuable for local governments and agencies in helping to keep costs down, officials and employees should also recognize the value — to themselves and their constituents — in taking the time to gather research on their own and commit their own thoughts to the keyboard.
