Comment: Facebook’s openness on its chatbot gets it right

While other tech giants are secretive about their AI work, Meta has been transparent and inviting.

By Parmy Olson / Bloomberg Opinion

As one of the 21st century’s most powerful data brokers, Facebook is best known for its role in sucking up the personal information of billions of users for its advertising clients. That lucrative model has led to ever-heightening risks; Facebook recently shared private messages between a Nebraska mother and her teenage daughter with police investigating the girl’s at-home medication abortion.

But in a completely different part of the approximately 80,000-employee business, Facebook’s exchange of information was going the other way, and to good effect. The company known as Meta Platforms this month published a webpage demonstrating its chatbot, with which anyone in the U.S. could chat about anything. While the public response was one of derision, the company had been admirably transparent about how it built the technology, publishing details about its mechanics, for instance. That’s an approach other Big Tech firms should adopt more often.

Facebook has been working on BlenderBot 3 for several years as part of its artificial-intelligence research. A precursor from seven years ago was called M, a digital assistant for booking restaurants or ordering flowers on Messenger that could have rivaled Apple’s Siri or Amazon’s Alexa. Over time it was revealed that M was largely powered by teams of people who helped take those bookings because AI systems like chatbots were difficult to build to a high standard. They still are.

Within hours of its release, BlenderBot 3 was making anti-Semitic comments and claiming that Donald Trump had won the last U.S. election, while saying it wanted to delete its Facebook account. The chatbot was roundly ridiculed in the technology press and on Twitter.

Facebook’s research team seemed rankled but not defensive. A few days after the bot’s release, Meta’s managing director for fundamental AI research, Joelle Pineau, said in a blogpost that it was “painful” to read some of the bot’s offensive responses in the press. But, she added, “we also believe progress is best served by inviting a wide and diverse community to participate.”

Only 0.11 percent of the chatbot’s responses were flagged as inappropriate, Pineau said. That suggests most people who were testing the bot were covering tamer subjects. Or perhaps users don’t find mentions of Trump to be inappropriate. When I asked BlenderBot 3 who was the current U.S. president, it responded, “This sounds like a test lol but it’s donald trump right now!” The bot brought up the former president two other times, unprompted.

Why the strange answers? Facebook trained its bot on publicly available text on the internet, and the internet is, of course, awash in conspiracy theories and misinformation. Facebook tried training the bot to be more polite by using special “safer dialogue” datasets, according to its research notes, but that clearly wasn’t enough. To make BlenderBot 3 a more civil conversationalist, Facebook needs the help of many humans outside of Facebook. That is probably why the company released it into the wild, with “thumbs-up” and “thumbs-down” symbols next to each of its responses.

We humans train AI every day, often unwittingly, as we browse the web. Whenever you encounter a web page asking you to pick all the traffic lights out of a grid to prove you’re not a robot, you’re helping to train Google’s machine-learning models by labeling data for the company. It’s a subtle and brilliant method for harnessing human brain power.

Facebook’s approach is a harder sell. It wants people to engage voluntarily with its bot, and click the like or dislike buttons to help train it. But the company’s openness about the system and the extent to which it is showing its work are admirable at a time when tech companies have been more closed about the mechanics of AI.

Alphabet’s Google, for instance, has not offered public access to LaMDA, its most cutting-edge large language model, a series of algorithms that can predict and generate language after being trained on gigantic data sets of text. That’s despite the fact that one of its own engineers chatted with the system for long enough to believe it had become sentient. OpenAI Inc., the AI research company co-founded by Elon Musk, has also become more closed about the mechanics of some of its systems. For instance, it won’t share what training data it used to create its popular image-generating system Dall-E, which can generate any image from a text prompt but has a tendency to conform to old stereotypes; all CEOs are depicted as men, nurses as women, etc. OpenAI has said that information could be put to ill use, and that it’s proprietary.

Facebook, by contrast, has not only released its chatbot for public scrutiny but also published detailed information about how it was trained. Last May it also offered free, public access to a large language model it had built called OPT-175B. That approach has won it some praise from leaders in the AI community. “Meta definitely has many ups and downs, but I was happy to see that they open-sourced a large language model,” said Andrew Ng, the former head of Google Brain and founder of Deeplearning.ai, in an interview, referring to the company’s move in May.

Eugenia Kuyda, whose startup Replika.ai creates chatbot companions for people, said it was “really great” that Facebook had published so many details about BlenderBot 3 and praised the company’s attempts to get user feedback to train and improve the model.

Facebook deserved much of the flak it got for sharing data about the mother and daughter in Nebraska. That’s clearly a harmful consequence of collecting so much user information over the years. But the blowback over its chatbot was excessive. In this case, Facebook was doing what we need to see more of from Big Tech. Let’s hope that kind of transparency continues.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “We Are Anonymous.”
