Comment: It’s not just Facebook; colleges using algorithms, too

And while the intent is to help students, ‘predictive analytics’ can use race and income to students’ disadvantage.

By Shea Swauger / Special To The Washington Post

Imagine being rejected from a university or advised out of your major because you’re Black, or a woman, or a first-generation college student. Imagine learning that these decisions were made by predictive analytics software that you can’t object to or opt out of. Just over a decade ago, this seemed unlikely. Now it seems difficult to stop.

That may sound futuristic, but St. George’s Hospital Medical School in London deployed this technology as early as the 1980s. Administrators trained a predictive model using historical admissions data to determine who was accepted to the medical program. It was supposed to eliminate the biases of admissions officers; unsurprisingly, it reproduced a pattern of discrimination. The demographics of incoming students skewed heavily toward white men, forcing the school to stop the practice.

Today, this is the reality faced by millions of students. This year, The Markup reported that more than 500 universities use a single company's predictive analytics product, which assigns students an academic "risk score" based on variables supposedly associated with the ability to succeed in college, including, at many schools, race. Black and Latino students were consistently rated as higher risk than their white or Asian peers.


The software also has a feature that encourages academic advisers to steer students toward majors where their risk scores are lower. Predictive analytics may be more likely to flag Black women as a risk in STEM majors, not because they're bad at STEM but because STEM programs have historically been racist and sexist, making their race and gender appear as risk factors to the algorithm. Even when variables such as race or gender are removed, predictive analytics can still infer demographic information from proxy data such as ZIP code, test scores and socioeconomic status, factors directly shaped by structural racism and sexism.
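To make the proxy point concrete, here is a minimal, purely illustrative sketch in Python. The data is synthetic, the feature names are hypothetical, and the scikit-learn model is my assumption, not the vendor's actual software; it simply shows how a model that never sees race can still produce risk scores that differ by race when the inputs it does see are correlated with race.

```python
# Illustrative sketch only: synthetic data, hypothetical feature names.
# Not the vendor's software. Demonstrates proxy leakage: a "race-blind"
# model still yields group-skewed risk scores because its inputs are
# correlated with the protected attribute.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000

# Synthetic protected attribute; never passed to the model.
group = rng.integers(0, 2, n)

# Proxies shaped by (simulated) structural inequity, so they track the group.
zip_poverty_rate = rng.normal(0.15 + 0.20 * group, 0.05, n)
test_score = rng.normal(1050 - 120 * group, 80, n)

# Historical "left without graduating" labels shaped by the same inequities.
dropped_out = (rng.random(n) < 0.15 + 0.20 * group).astype(int)

X = np.column_stack([zip_poverty_rate, test_score])  # 'group' is excluded
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, dropped_out)

risk = model.predict_proba(X)[:, 1]
print("mean risk score, group 0:", round(float(risk[group == 0].mean()), 3))
print("mean risk score, group 1:", round(float(risk[group == 1].mean()), 3))
# The group means diverge even though 'group' was never an input feature.
```

Removing the sensitive column, in other words, does not remove the pattern; the model reconstructs it from whatever correlated signals remain.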

In theory, predictive analytics software might equip schools with information that could help them direct resources to the students who need the most help. In practice, however, it can easily prejudice an administrator’s or professor’s impression of their students’ interests and abilities.

It’s very difficult for people who are harmed by software to protest its use or prove that it discriminated against them. Tech companies often object to transparent audits of their code, calling them a violation of intellectual-property laws. Students usually aren’t aware that their schools are gathering data or using it to make decisions about them in the first place. When applying to college, prospective students may sign terms of service or general data-use agreements that give implicit permission to be tracked and evaluated, but they aren’t explicitly told how much impact these technologies can have on their lives.

So why do colleges and universities use predictive analytics? Higher ed has been on a decades-long defunding streak at the hands of states and the federal government, and at most institutions, students generate the majority of revenue. Recruiting students and keeping them in school is a financial necessity. While ed tech companies promise that their software can identify paying students who will stay enrolled until graduation, the evidence for their claims is mixed.

If this trend continues, tools like this could be used on students much earlier in their education. K-12 public school systems have very limited resources and are accountable to governments for things like test scores, which could make predictive analytics a tempting tool to try to “optimize” students’ performance. It could determine who’s selected for advanced classes or denied supplemental learning support, or what kind of accommodations are granted for a student with a disability. Technology like this is often implemented for one purpose but has a bad habit of expanding its scope.

Judging by last year’s wave of protests against universities’ use of facial recognition, students will continue to voice opposition to discriminatory technology. If faculty members become subject to these tools, they’ll probably protest, too. In response, tech companies will, as usual, step up their public relations efforts. That’s what they did when called upon to act more responsibly with artificial intelligence, creating “ethical” technology initiatives as cover while they fired the employees actually attempting to build more ethical technology.

Recent movements to ban facial recognition and remote test proctoring have shown how the public can effectively push back against the expanding technological panopticon. About 20 different jurisdictions, including cities such as Boston and Portland, Ore., and the state of Vermont, have banned certain uses of facial recognition, and there’s a good chance we’ll see more such regulations in the next few years. This burgeoning movement may help spark a broader public conversation that will shape emerging norms of privacy, informed consent, data rights and justice in technology.

Predictive analytics tends to encode historic harms into our planning, limiting our sense of possibility. The companies that sell it to educational institutions may have gee-whiz marketing campaigns all about the future. Ultimately, though, such technology will keep us bound to the past, stuck in our old patterns of oppression, unless we choose to imagine something better.

Shea Swauger is a librarian, researcher and doctoral student in education and critical studies at the University of Colorado Denver.
