Editorial: Keep a mindful eye on government use of AI chatbots

A public media report on government use of chatbots, including by Everett, calls for sound guidelines.

By The Herald Editorial Board

It’s not just the kids using artificial intelligence chatbots to finish that essay on “The Scarlet Letter” for a class assignment due the next morning.

Turns out a fair number of local government staffers and officials are using generative AI programs, such as ChatGPT and Microsoft’s Copilot, for a range of tasks, including generating — in whole or in part — social media posts, press releases, policy documents, speeches, talking points, grant and other funding applications, and replies to constituent emails.

A two-part investigation published this week by public radio station KNKX (94.5 FM) and Cascade PBS (Channel 9) made public records requests of city officials throughout the state, which returned thousands of pages of chat logs including prompts and the AI responses. The reporting includes examples of those records as well as interviews with officials from the cities of Everett and Bellingham. The reporting makes clear that neither city is an outlier in its use of generative AI, but that both were the most responsive to the public records request; other local governments, including Seattle, are in the process of responding to the request.

Increasingly, generative AI is being seen as a money-saving tool, especially for state and local governments looking to meet increasing demands on staff time.

A recent survey of information technology (IT) decision makers by EY, the professional services company formerly known as Ernst & Young, found that about 45 percent of state and local governments report using AI, with 39 percent using generative AI chatbots as a routine part of their work.

Chatbots, drawing from vast amounts of online data, respond to prompts and simulate human conversation by predicting the most likely words to follow in a sentence. They’ve been used to generate — we won’t say “write” — prose and poetry, music, computer code, detailed reports and more than a few last-minute homework assignments.
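
For readers curious what “predicting the most likely words” means in practice, here is a deliberately tiny sketch in Python. It is an illustration only, built on a made-up snippet of text, and bears no resemblance to the scale or sophistication of ChatGPT or Copilot; it simply counts which word most often follows another and offers that count as its “prediction.”

```python
from collections import Counter, defaultdict

# Toy illustration only: tally which word tends to follow each word in a
# small, made-up sample of text, then "predict" the most common follower.
# Commercial chatbots do something far more sophisticated, scoring likely
# continuations learned from vast amounts of training data.
sample_text = "the city council approved the city budget and the city plan"

words = sample_text.split()
followers = defaultdict(Counter)
for current_word, next_word in zip(words, words[1:]):
    followers[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the sample."""
    if word not in followers:
        return "(no prediction)"
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))   # prints "city", the most common follower of "the" here
print(predict_next("city"))  # prints whichever word most often followed "city"
```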

Everett, the report noted, had used the technology for tasks ranging from the mundane, such as a lighthearted Instagram post marking the first day of spring, to more complex work, such as gathering data and providing guidance on policy questions on topics including housing supply, gunshot detection tools and the city’s comprehensive plan update for 2044.

And that use is with the blessing of both mayors.

“I think that we all are going to have to learn to use AI,” Everett Mayor Cassie Franklin told reporter Nate Sanford. “It would be silly not to. It’s a tool that can really benefit us.”

Long facing a structural deficit and — like all local governments — limited to a 1 percent increase in property tax revenue, Everett has made numerous cuts to staff in recent years, making technology such as AI of value in continuing to meet demands for city services.

“If we don’t embrace it and use it, we will really be left behind,” Franklin said.

As valuable as that tool could be in such situations, local governments will need to take care with a technology that — in the case of OpenAI’s ChatGPT — has been in development for only three years but already is on its 19th iteration, with two active versions.

And at least some city staffers, the report notes, understand that generative AI shouldn’t be left to its own devices. The technology is prone to “hallucinating,” fabricating data, figures and sources that don’t exist.

In May, the Washington Post reported that a White House report issued by the Department of Health and Human Services on children’s health made multiple errors, citing the wrong author for some studies and citing reports that didn’t exist at all; footnotes showed indications the report had been created using generative AI.

Used in updating Everett’s comprehensive plan, ChatGPT made repeated mistakes — kept out of the final draft — in analyzing the percentages of city residents spending more than 30 percent of their income on housing, the report found, to the exasperation of a city employee: “I told you factual accuracy is paramount and not to make unsubstantiated assertions or remarks, and you just did it,” the staffer said in one response. “Please remember this key instruction. Do not hallucinate.”

The ‘bot’s response after making corrections: “Thank you for catching that. I’ll ensure these totals are used going forward.”

One imagines the above sentence in the dispassionate voice of the HAL 9000 computer from “2001: A Space Odyssey.”

Everett isn’t sending city employees blindly into this brave new world. In June 2024 it provided provisional guidelines on AI usage and is developing an update with stronger guidance, using a model developed by the GovAI Coalition, a group of cities collaborating on policy for the use of AI in government. One recent change: Everett staff are now asked to use a version of Microsoft’s Copilot chatbot that was developed with local government in mind.

The new policy also wrestles with when and how to notify the public that AI has been used in city communications and documents. The policy being drafted in Everett will require disclosure of its use for tasks involving “more than mere language refinement,” the report said.

As the use of this tool becomes more frequent, transparency regarding its use will be key to maintaining the public’s trust. In a recent KFF poll, 65 percent of respondents said they are not confident in the accuracy of health information provided by AI chatbots. That skepticism likely extends beyond health issues.

Other local governments should be following Everett’s lead in developing and regularly updating their guidance on the use of generative AI and chatbots, at least as often as technology companies are updating these tools.

Simone Tarver, Everett’s communications manager, told Cascade PBS and KNKX that select staff from city departments are meeting weekly for AI training and discussion.

AI chatbots are remarkable in their ability to mimic — and mimic is what they do — analysis and reasoning, even producing complete works in mere seconds. But the human process of thinking — the thought involved in gathering information, considering it and communicating it to others — can’t be replicated by microchips in a server farm.

While the time-saving capability of AI can be seen as valuable for local governments and agencies in helping to keep costs down, officials and employees should also recognize the value — to themselves and their constituents — in taking the time to gather research on their own and commit their own thoughts to the keyboard.
