Editorial: Schools need to adopt policies on use of AI tools

Districts, like Edmonds, have codes that outline rules that prevent abuse yet allow student use.

By The Herald Editorial Board

The debate over technology in the classroom likely reaches back across millenniums to the days of the abacus.

“My kids don’t need to learn how to move beads up and down rods on a frame; that’s cheating. Counting on fingers and toes was good enough for me.”

While the modes of technology have evolved, much of the discussion sounds the same — whether the fears were over CliffsNotes or filmed versions of classic literature, on up to the use of calculators, computers and the internet — fears that kids could use the latest tools to simplify assignments and potentially cheat on essays and tests.

Those fears have been further amplified by the seemingly lightning-speed development in recent years of generative artificial intelligence and programs such as ChatGPT. At its most basic, students can take a teacher’s assignment and turn it into a prompt for the chatbot: “Write a 750-word essay on the outcomes of the Louisiana Purchase in the style of a high school sophomore.” Seconds later, copy, paste and turn in the assignment.

It hasn’t taken long for teachers and instructors to learn to keep an eye peeled for such work-avoidance and to tell the difference between student and microchip. And some teachers are using AI itself to aid in the detection.

Snohomish High School teacher Kathy Purviance-Snow, profiled by Herald reporter Aina de Lapparent Alvarez in a report on the issue in Thursday’s Herald, uses an AI program’s algorithm to estimate the likelihood that a turned-in assignment was lifted from a generative AI program. Anything above 25 percent raises suspicions, and 50 percent or higher ranks as “suspect.”

Purviance-Snow gives the student a chance to admit the computer-assisted plagiarism, then redo the assignment.

“I try to err on the side of grace because kids make mistakes, people make mistakes,” she told The Herald.

Yet Purviance-Snow’s method of dealing with student abuse of AI isn’t universal in schools. In fact, school districts in Snohomish County have taken different approaches to the existence and use of AI, de Lapparent Alvarez reported. The Everett and Snohomish school districts have blocked access, in the short term, to generative AI software on district-owned and student-used tablets, laptops and computers, while other districts, including Edmonds and Lake Stevens, have decided not to ban the software’s use.

Higher education also is having to address the issue. Everett Community College instructors don’t use detection software supplied by the college, though some instructors have asked for it. Others have concerns about mistakes in detection that might cast unfair suspicion on some students.

School districts and higher education have three options in addressing the technology, according to a report last August by the Brookings Institution: banning it, integrating it or placing it under review.

Last May, both the New York City public schools and the Los Angeles Unified schools blocked access to ChatGPT from school Wi-Fi networks and devices, a tack taken by districts across the country, according to the Brookings report. But within four months of its ban, the New York City system reversed its decision and worked with industry representatives and educators to look for avenues to use AI in education, rather than wait it out in hopes it would go away.

As well, there are discussions at all levels of education about the potential benefits that generative AI and the range of tools it powers can provide in education, including opportunities for learning and evaluation, and for demonstrating how teachers — and students — can learn generative AI’s weaknesses and strengths. The same software that can be used to pass off an essay as a student’s work can also be used in the classroom to assist students whose first language isn’t English, providing instant translation.

A combination of review and integration appears to be the path chosen by the Edmonds School District. The district’s director of technology, Chris Bailey, last year led a school board work session on artificial intelligence that, with assistance from the state Office of the Superintendent of Public Instruction and the Association of Educational Service Districts, helped develop a Student AI Code of Conduct, which outlines student responsibilities as well as recommendations for how AI should and shouldn’t be used by students.

The code, while finding a place in education for AI, is clear that “teachers must have access to students’ authentic displays of learning.”

Edmonds’ review and integration provides a model for other school districts, as does guidance from the state OSPI on “Human-Centered AI,” advising school district administrators, teachers, students and families.

The 19-page report advocates for an approach that starts “with human inquiry and always ends with human reflection, human insight and human empowerment,” using AI as a tool and not the end product. The guidance recommends developing students’ AI literacy; outlining the ethical, equitable and safe use of AI; providing professional development and support for educators; applying human-centered design that provides transparency into how AI tools are developed, tested and reviewed; and aligning AI tools with the best practices and principles of education.

With the increasing realization of both the benefits and risks inherent in generative AI, it was perhaps a happy accident that school districts were left in an ad-hoc environment that allowed them to develop a range of solutions and approaches. Now the best of those guidelines and guardrails can be adopted and further refined statewide and nationwide.

Humans, in witnessing the products of AI — especially its ability to gather information and produce essays, art and more in mere seconds — can feel intimidated, until one realizes that AI can’t function unless it can draw from the collected knowledge that humans have developed.

In its highest and best use, AI should be like the abacus: a tool, in the control of human hands that have been shown how to use it.
