Comment: It’s not just Facebook; colleges using algorithms, too

And while the intent is to help students, ‘predictive analytics’ can use race and income to students’ disadvantage.

By Shea Swauger / Special To The Washington Post

Imagine being rejected from a university or advised out of your major because you’re Black, or a woman, or a first-generation college student. Imagine learning that these decisions were made by predictive analytics software that you can’t object to or opt out of. Just over a decade ago, this seemed unlikely. Now it seems difficult to stop.

That may sound futuristic, but St. George’s Hospital Medical School in London deployed this technology as early as the 1980s. Administrators trained a predictive model using historical admissions data to determine who was accepted to the medical program. It was supposed to eliminate the biases of admissions officers; unsurprisingly, it reproduced a pattern of discrimination. The demographics of incoming students skewed heavily toward white men, forcing the school to stop the practice.

Today, this is the reality faced by millions of students. This year, The Markup reported that more than 500 universities use a single company’s predictive analytics product, which assigns students an academic “risk score” based on variables that are supposedly associated with people’s ability to succeed in college, including, at many schools, race. Black and Latino students were consistently rated as higher risk than their white or Asian peers.

The software also has a feature that encourages academic advisers to steer students toward majors where their risk scores are lower. Predictive analytics may be more likely to flag Black women as a risk in STEM majors, not because they’re bad at STEM but because STEM programs have historically been racist and sexist, making their race and gender appear as risk factors to the algorithm. Even when variables such as race or gender are removed, predictive analytics can still infer demographic information from proxy data such as ZIP code, test scores and socioeconomic status, factors directly affected by structural racism and sexism.

In theory, predictive analytics software might equip schools with information that could help them direct resources to the students who need the most help. In practice, however, it can easily prejudice an administrator’s or professor’s impression of their students’ interests and abilities.

It’s very difficult for people who are harmed by software to protest its use or prove that it discriminated against them. Tech companies often object to transparent audits of their code, calling them a violation of intellectual-property laws. Students usually aren’t aware that their schools are gathering data or using it to make decisions about them in the first place. When applying to college, prospective students may sign terms of service or general data-use agreements that give implicit permission to be tracked and evaluated, but they aren’t explicitly told how much impact these technologies can have on their lives.

So why do colleges and universities use predictive analytics? Higher ed has been on a decades-long defunding streak at the hands of states and the federal government, and at most institutions, students generate the majority of revenue. Recruiting students and keeping them in school is a financial necessity. While ed tech companies promise that their software can identify paying students who will stay enrolled until graduation, the evidence for their claims is mixed.

If this trend continues, tools like this could be used on students much earlier in their education. K-12 public school systems have very limited resources and are accountable to governments for things like test scores, which could make predictive analytics a tempting tool for trying to “optimize” students’ performance. It could determine who’s selected for advanced classes or denied supplemental learning support, or what kind of accommodations are granted for a student with a disability. Technology like this is often implemented for one purpose but has a bad habit of expanding its scope.

Judging by last year’s wave of protests against universities’ use of facial recognition, students will continue to voice opposition to discriminatory technology. If faculty members become subject to these tools, they’ll probably protest, too. In response, tech companies will, as usual, step up their public relations efforts. That’s what they did when called upon to act more responsibly with artificial intelligence, creating “ethical” technology initiatives as cover while they fired the employees actually attempting to build more ethical technology.

Recent movements to ban facial recognition and remote test proctoring have shown how the public can effectively push back against the expanding technological panopticon. About 20 different jurisdictions, including cities such as Boston and Portland, Ore., and the state of Vermont, have banned certain uses of facial recognition, and there’s a good chance we’ll see more such regulations in the next few years. This burgeoning movement may help spark a broader public conversation that will shape emerging norms of privacy, informed consent, data rights and justice in technology.

Predictive analytics tends to encode historic harms into our planning, limiting our sense of possibility. The companies that sell it to educational institutions may have gee-whiz marketing campaigns all about the future. Ultimately, though, such technology will keep us bound to the past, stuck in our old patterns of oppression, unless we choose to imagine something better.

Shea Swauger is a librarian, researcher and doctoral student in education and critical studies at the University of Colorado Denver.
