Comment: European Union starts work on AI rules for chatbots

The EU isn’t waiting to set guardrails for artificial intelligence.

By Lionel Laurent / Bloomberg Opinion

America innovates, Europe regulates.

Just as the world is starting to come to grips with OpenAI, whose boss Sam Altman has both leapfrogged the competition and pleaded for global rules, the European Union has responded with its own bid for AI superpower status: the Artificial Intelligence Act. The AI Act, which faced a European Parliament vote on Wednesday, would make the EU the first to set minimum standards.

Yet we’re a long way from the deceptively simple world of Isaac Asimov’s robot stories, which saw sentient machines deliver the benefits of powerful “positronic brains” with just three rules in place: don’t harm humans, obey humans and defend your existence. AI is clearly too important not to regulate thoroughly, but the EU will have its work cut out to reduce the AI Act’s complexity while promoting innovation.

The AI Act has some good ideas on transparency and trust: Chatbots will have to declare whether they’re trained on copyrighted material, deepfakes will have to be labeled, and obligations for the kind of models used in generative AI will require serious efforts to catalog datasets and take responsibility for how they’re used.

Lifting the lid on opaque machines that process huge swaths of human output is the right idea, and gets us closer to more dignity around data treatment. As Dragos Tudorache, co-rapporteur of the law, told me recently, the purpose is to promote “trust and confidence” in a technology that has attracted huge amounts of investment and excitement yet also produced some very dark failures. Self-regulation isn’t an option; neither is “running into the woods” and doing nothing out of fear that AI could wipe out humanity one day, he said.

The AI Act is complex, however, and runs the paradoxical risk of setting the bar too high to promote innovation but not high enough to avoid unpredictable outcomes. The main approach is to categorize AI applications into buckets of risk, from minimal (spam filters, video games) to high (workplace recruitment) to unauthorized (real-time facial recognition).

That makes sense from a product-safety point of view, with AI system providers expected to meet rules and requirements before putting their products on the market. Yet the category of high-risk applications is a broad one, and the downstream chain of responsibility in an application like ChatGPT shows how tech can blur product-safety frameworks. When a lawyer relies on AI to craft a motion and unwittingly fills it with made-up case law, are they using the product as intended or misusing it?

It’s also not clear how this will work with other data-privacy laws like the EU’s General Data Protection Regulation, which Italy used as justification for a temporary block on ChatGPT. And while more transparency on copyright-protected training data makes sense, it could conflict with past copyright exceptions granted for data mining back when creative industries viewed AI with less caution.

There’s a risk that the AI Act’s actual outcome will entrench the EU’s dependency on big U.S. tech firms like Microsoft and Nvidia. European firms are champing at the bit to tap into the potential productivity benefits of AI, but the large incumbent providers will likely be best-positioned to handle the estimated $3 billion in upfront compliance costs and non-compliance fines of up to 7 percent of global revenue.

Adobe has already offered to legally compensate businesses if they’re sued for copyright infringement over any images its Firefly tool creates, according to Fast Company. Some firms may take the calculated risk of avoiding the EU entirely — Google’s parent company Alphabet has yet to make its chatbot Bard available there.

The EU has fine-tuning to do as final negotiations begin on the AI Act, which might not come into force until 2026. Countries like France that are nervous about losing more innovation ground to the U.S. will likely push for more exemptions for smaller businesses. Bloomberg Intelligence analyst Tamlin Bason said he sees a possible “middle ground” on restrictions, which should leave room for initiatives to foster new tech ideas such as promoting ecosystems linking universities, startups and investors. There should be more global coordination at a time when angst around AI is widespread; the G7’s new Hiroshima AI process looks like a useful forum to discuss issues like intellectual property rights.

Perhaps one bit of good news is that AI is not about to destroy all jobs held by human compliance officers and lawyers. Technology consultant Barry Scannell said that companies will be looking at hiring AI officers and drafting AI impact assessments, similar to what happened in the aftermath of the GDPR.

Reining in the robots requires more human brainpower; that may be one twist you won’t find in an Asimov story.

Lionel Laurent is a Bloomberg Opinion columnist covering digital currencies, the European Union and France. Previously, he was a reporter for Reuters and Forbes.
