By The Herald Editorial Board
Rest assured: among the 42 members of a proposed state task force looking at the potential benefits and risks posed by artificial intelligence (specifically the generative AI tools generating headlines for the works of art, term papers and virtual conversations they "create" with a few commands), a chatbot won't be included on the panel.
State legislation to create an artificial intelligence task force received one of the session’s first hearings Wednesday before the Senate Committee on Environment, Energy and Technology. Senate Bill 5838, sponsored by Sen. Joe Nguyen, D-White Center, at the request of state Attorney General Bob Ferguson, would establish the task force to assess how AI is used, develop guiding principles and make recommendations regarding its regulation.
“Washington is on the cutting edge of innovation,” Ferguson said in a release in December in seeking the legislation. “It is imperative that we embrace new technology in a thoughtful way. As we celebrate the benefits, we must also ensure we protect against the potential for irresponsible use and unintentional consequences.”
As a home for some of the companies developing AI, Washington should be leading discussion and exploration of the technology’s implications, good and bad. Colorado, Illinois, Vermont and Virginia have established similar task forces to study AI.
Among the potential findings and recommendations the Attorney General’s office expects the task force to consider:
Guiding principles for the use of generative AI;
Identification of high-risk uses, including those that concern the public’s safety and fundamental rights;
Support and protection of AI’s innovations;
Recommendations for public education on AI’s development and uses; and
A review of public policy regarding AI and its potential effects on the public, historically excluded communities, racial equity, the workforce and ethical concerns.
The legislation schedules a preliminary report to the Legislature by Dec. 1, 2025, with final findings and recommendations by June 1, 2027.
Among those speaking at Wednesday's hearing, representatives of the tech industry, business interests, retailers and labor offered general support for the proposal, but some raised concerns about the size of the task force and the possibility that such an effort might place too much emphasis on regulation.
As to the 42 members, the legislation calls for members of both parties from both legislative chambers, as well as representatives of specific state agencies, the state's tribes, law enforcement, racial and civil liberties advocacy groups, the tech industry, data privacy experts, consumer protection, labor, cybersecurity, retail, universities or research institutions and students.
Fortunately, the legislation also allows the task force's meetings to take place by video conference, saving the search for a table big enough for 42 people. An earlier proposal apparently envisioned an even larger task force.
“We appreciate it was narrowed down from about 72, but it’s a very cumbersome size,” said Bob Battles, director of governmental affairs for the Association of Washington Business, which is specifically named as a member of the task force. “I appreciate having a spot on it, but we want to make sure this is an equitable set-up.”
Battles also expressed concern regarding regulation that the task force might propose. Artificial intelligence, Battles said, has been around for decades and despite the recent attention is not a new technology but one in constant development.
“So we need to make sure that we are looking at it, not to just regulate, but to make sure that we don’t stifle innovation,” he said.
As to the task force's size, 42 may indeed be too cumbersome to allow effective consideration of the issues involved. Then again, Sen. Liz Lovelett, D-Anacortes, who presided over Wednesday's hearing, noted that "42," in the science fiction novel "The Hitchhiker's Guide to the Galaxy," was the answer given by an artificial intelligence named "Deep Thought" to the "Ultimate Question of Life, the Universe and Everything."
“So, maybe that will work,” Lovelett said.
Limiting the size of the task force won’t necessarily limit the range of experts and stakeholders who can provide useful testimony, data and perspective to the panel. We will note that among the 42 proposed members, there is no one representing those humans who have provided the works of art, photographs, video, written and spoken word and more on which generative AI tools, such as ChatGPT, were “trained” and from which they draw the information they use in their responses.
Battles and others noted that copyright law already is in place to address the use of existing content.
The New York Times recently added to a long list of litigation against the developers of generative AI tools, suing Microsoft and ChatGPT creator OpenAI for scraping content it produced over decades and using it to train AI technologies without consent or compensation.
But if the task force needs to avoid a bias toward regulation, the panel also will need to confront the assumption that AI is producing something entirely new on its own, for which the human user can take credit. The debt owed to the media on which AI is being trained shouldn’t be ignored.
Just as the world has benefited from advances in computer technology since the first bits and bytes, generative AI is poised to further expand those opportunities. But the recent pace of AI's advancements and expanding capabilities warrants as much attention to its potential for abuse as to its benefits.
Those intent on spreading disinformation for personal and political motives have eagerly delved into AI’s potential for the creation of convincing “deep fakes” showing celebrities, officials and everyday people doing and saying things they haven’t. Protection also should be considered for those who might be displaced from creative work, as movie and television writers demanded and won in contract negotiations.
Regardless of its size, a task force, conducting its work in full public view, is necessary to help the state prepare to limit any abuses and take full advantage of the technology's achievements, setting guardrails that can keep AI on task.
And no offense to the chatbots, but the humans can handle this.