Editorial: Schools need to adopt policies on use of AI tools

Districts such as Edmonds have adopted codes of conduct with rules that prevent abuse yet allow student use.

By The Herald Editorial Board

The debate over technology in the classroom likely reaches back across millenniums to the days of the abacus.

“My kids don’t need to learn how to move beads up and down rods on a frame; that’s cheating. Counting on fingers and toes was good enough for me.”

While the modes of technology have evolved, much of the discussion sounds the same — whether the fears were over Cliff’s Notes or filmed versions of classic literature, on up to the use of calculators, computers and the internet — fears that kids could use the latest tools to simplify assignments and potentially cheat on essays and tests.

Those fears have been further amplified with the seemingly lightning-speed development in recent years of generative artificial intelligence and programs such as ChatGPT. At its most basic, students can take a teacher’s assignment and turn it into a prompt for the chatbot: “Write a 750-word essay on the outcomes of the Louisiana Purchase in the style of a high school sophomore.” Seconds later, copy, paste and turn in the assignment.

It hasn’t taken long for teachers and instructors to learn to keep an eye peeled for such work avoidance and to tell the difference between student and microchip. Some are using AI itself to aid in the detection.

Snohomish High School teacher Kathy Purviance-Snow, profiled by Herald reporter Aina de Lapparent Alvarez in a report on the issue in Thursday’s Herald, uses an AI program to estimate the likelihood, as a percentage, that a turned-in assignment was lifted from a generative AI program. Anything above 25 percent raises suspicions, and 50 percent or higher ranks as “suspect.”

Purviance-Snow gives the student a chance to admit the computer-assisted plagiarism, then redo the assignment.

“I try to err on the side of grace because kids make mistakes, people make mistakes,” she told The Herald.

Yet Purviance-Snow’s method of dealing with student abuse of AI isn’t universal in schools. In fact, school districts in Snohomish County have taken different approaches toward the existence and use of AI, de Lapparent Alvarez reported. The Everett and Snohomish school districts have blocked access, in the short term, to generative AI software on district-owned tablets, laptops and computers used by students, while other districts, including Edmonds and Lake Stevens, have decided not to ban the software’s use.

Higher education also is having to address the issue. Everett Community College doesn’t supply its instructors with detection software, though some instructors have asked for it. Others have concerns that mistakes in detection might cast unfair suspicion on some students.

School districts and higher education have three options in addressing the technology, according to a report last August by the Brookings Institution: banning it, integrating it or placing it under review.

Last May, both New York City public schools and Los Angeles Unified schools blocked access to ChatGPT from school Wi-Fi networks and devices, a tack taken by districts across the country, according to the Brookings report. But within four months of its ban, New York City’s school system reversed its decision and worked with industry representatives and educators to look for avenues to use AI in education, rather than wait it out in hopes it would go away.

As well, there are discussions at all levels of education about the potential benefits that generative AI and the range of tools it powers can provide in education, including opportunities for learning and evaluation, and demonstrations of how teachers — and students — can learn generative AI’s weaknesses and strengths. The same software that can be used to pass off an essay as a student’s work can also be used in the classroom to assist students whose first language isn’t English, providing instant translation.

A combination of review and integration appears to be the path chosen by the Edmonds School District. The district’s director of technology, Chris Bailey, last year led a school board work session on artificial intelligence. With assistance from the state Office of Superintendent of Public Instruction and the Association of Educational Service Districts, that work helped develop a Student AI Code of Conduct, which outlines student responsibilities as well as recommendations for how AI should and shouldn’t be used.

The code, while finding a place in education for AI, is clear that “teachers must have access to students’ authentic displays of learning.”

Edmonds’ review and integration provides a model for other school districts, as does guidance from the state OSPI on “Human-Centered AI,” advising school district administrators, teachers, students and families.

The 19-page report advocates for an approach that starts “with human inquiry and always ends with human reflection, human insight and human empowerment,” using AI as a tool and not the end product. The guidance recommends developing students’ AI literacy; outlining the ethical, equitable and safe use of AI; providing professional development and support for educators; applying human-centered design that offers transparency into how AI tools are developed, tested and reviewed; and aligning AI tools with the best practices and principles of education.

With the increasing realization of both the benefits and risks inherent in generative AI, it was perhaps a happy accident that school districts were left in an ad hoc environment that allowed them to develop a range of solutions and approaches. Now the best of those guidelines and guardrails can be adopted and further refined statewide and nationwide.

Humans witnessing the products of AI — especially its ability to gather information and produce essays, art and more in mere seconds — can feel intimidated, until they realize that AI can’t function unless it can draw from the collected knowledge that humans have developed.

In its highest and best use, AI should be like the abacus: a tool, in the control of human hands that have been shown how to use it.
