WOODINVILLE — Henry Soto first learned he’d been drafted into the fight while on a business trip in British Columbia.
It was 2008. The Woodinville-area man had spent much of two years navigating the twisting path of vendor jobs and serial interviews often required to land a coveted full-time position with Microsoft Corp. Now, he was facing another reality of working for the tech giant. The message said he’d been assigned new duties under one of the company’s periodic “reorgs.”
Only later did he understand what the job entailed.
He found himself seated in a cubicle, spending every work day closely scrutinizing toxic digital content.
Child pornography. Bestiality. Videos of people being raped, tortured, murdered.
Soto was repelled by the images filling his computer screen. But he couldn’t turn away.
Those were real people being brutalized, many of them children.
His employer gave him a mission, one he had never imagined, let alone prepared for.
“I just was in the war one day,” Soto said.
He spent much of six years bringing what he found to the attention of authorities worldwide.
Fast forward to 2017.
Soto, 37, now is living with a diagnosis of post-traumatic stress disorder. His health care providers have linked his anxiety, memory troubles and other crippling symptoms to repeated exposure to the materials he helped remove from digital space.
A typical winter’s evening finds him at home, the shades drawn, the doors double-locked. Indoors or out, he routinely wears sunglasses and a hoodie sweatshirt pulled up over his head to dampen a hypersensitive startle reflex.
Soto fears that child pornographers and other criminals he helped shut down may attempt revenge. Meanwhile, his existence has narrowed to a minute-by-minute exercise in avoiding things that cause his mind to leap into the past, forcing him through flashbacks or nightmares to again bear witness to humanity at its worst.
Among the tech worker’s triggers are kids and computers. His wife, Sara, continues to work at Microsoft. The Sotos have a child in elementary school.
Ben Wells is an Arlington attorney, perhaps best known for helping his neighbors navigate the tangle of trauma and bureaucratic red tape spawned by the deadly Oso mudslide. He’s now trying to help the Sotos get their lives back.
In December, Wells joined Seattle attorney Becky Roe in filing a King County Superior Court lawsuit against Microsoft.
Roe, a fierce civil litigator and former deputy prosecutor renowned for her pursuit of sex offenders, represents Greg Blauert, a former Soto co-worker who is living with similar damage.
In court papers, the attorneys say Microsoft negligently failed to protect employees from foreseeable workplace harm and didn’t react with sufficient care when it became clear Soto, Blauert and others were suffering.
“Defendant Microsoft created the Online Safety team to protect its customers from viewing unsafe and highly disturbing images that could cause their customers harm. Toxic images cannot be ‘unseen,’” the lawsuit says. Microsoft “knew or should have known that the same images could reasonably cause harm to its employees.”
The company as of last week had not filed a formal response to the lawsuit. It did release a written statement (published in full below).
“We disagree with the plaintiffs’ claims,” the statement said. “Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.”
The statement describes what Microsoft is doing to protect staff engaged in work Soto once performed. It doesn’t address central questions in the lawsuit: what the corporation knew about the potential for harm, and when.
There is no dispute that Microsoft and other tech companies were on notice that their products were being used in ways that caused harm. That included making it easier for child pornographers to share images and for terrorists to traffic in videos of killings.
In 2008, new federal legislation required tech companies to report child pornography to the National Center for Missing and Exploited Children. Microsoft deployed computer programs to flag objectionable content.
The programs keyed in on photos, video and other computer files that potentially were in violation of the company’s terms of service — or worse, contained evidence of criminal activity. Humans still were needed to eyeball the material and make judgment calls.
At first, Soto and other team members were placed together in a typical cubicle farm. Before long they were moved behind closed doors. Other workers complained about the disturbing content on their computer screens, he said.
Online Safety workers were required to do more than simply glance at an objectionable image or video. They pored over the content, filling out digital forms that described in clinical terms what they’d found.
Soto is proud to have played a role in developing tools still being used. Pleadings filed in the lawsuit describe how he worked with experts to create a matrix of maturation differences in the bodies of young people around the world. The matrix helped his team more accurately judge the age of those photographed and videotaped while engaged in sexual activity.
He also was involved in work on “PhotoDNA” technology, which computes a signature, Soto said, that sticks with an image regardless of later edits. In theory, that meant servers and other digital space could be searched to identify multiple users trafficking in specific illegal content, a potential boon to investigators targeting child porn rings.
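The idea behind such image signatures can be illustrated with a much simpler, well-known technique called average hashing. This is a toy stand-in, not Microsoft’s proprietary PhotoDNA algorithm (whose details are not public): it shows only the general principle of deriving a compact fingerprint from an image’s content that survives minor edits, so copies can be matched by fingerprint rather than by exact file bytes.

```python
# Illustrative "average hash" sketch -- NOT PhotoDNA, just a toy example of
# the same idea: a signature derived from image content, not file bytes,
# that stays stable through minor edits such as a brightness change.

def average_hash(pixels):
    """Compute a 64-bit signature from an 8x8 grayscale pixel grid.

    Each bit records whether a pixel is brighter than the image's mean,
    so the hash is unchanged by a uniform brightness shift.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

# A toy 8x8 "image" and an edited copy (every pixel brightened by 10).
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brightened = [[p + 10 for p in row] for row in original]

# The edit does not change the signature, so the copy still matches.
assert average_hash(original) == average_hash(brightened)
```

In a real matching system, hashes of known illegal images are stored in a database; newly uploaded files are hashed and compared, and small Hamming distances flag likely matches for the kind of human review Soto’s team performed.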
By 2009, Microsoft began providing counseling for Online Safety team members. Those who participated were told about “compassion fatigue” and the company’s wellness program, the lawsuit says.
“They were not told that the more they became invested in saving people, the less able they would become to recognize and act on their own symptoms of PTSD,” the lawsuit says. The person who ran the program lacked the training and experience to recognize the risks, the attorneys contend. Moreover, the counselor lacked authority to demand changes to protect workers, including rotating employees to other duties so they wouldn’t burn out.
Soto found the counseling ineffective. Others on his team “psychologically and emotionally broke down in the presence of directors and supervisors,” the lawsuit says.
Early in 2010, Soto began seeing a psychiatrist who placed him on medications that were helpful in managing his symptoms. His problems were diagnosed as stemming from on-the-job PTSD.
Soto still couldn’t look away. While the work came at a cost — sleep problems, irritability, difficulty maintaining focus — there were victories. He and other team members took quiet satisfaction in headlines about child porn rings being dismantled.
He also worried about letting down others.
Microsoft contracted with companies in the Philippines to help screen problematic content. Soto said he was tapped to help manage the arrangement, visiting the Philippines to meet with vendors two or three times a year. Workers there were paid far less.
“People were not OK,” he said. “They were having the same issues I was having.”
He encouraged the vendors to establish an employee wellness program and spoke with the workers about the importance of self-care, the need to take breaks, to look up from their computer screens.
In 2014, Soto reached his breaking point. His team discovered a video recording of a young girl being raped and murdered. He wanted to kill the man, to make him suffer. He began experiencing auditory hallucinations, according to court papers.
Soto asked for a transfer. He had to apply for another job, the lawsuit says.
He went into a different position, but it wasn’t a clean break. His new work area was close enough to the Online Safety team that he was regularly asked for advice.
The new job didn’t work out. He began receiving negative performance reviews.
In February 2015, Soto’s psychiatrists suggested he take a medical leave from Microsoft. He hasn’t returned.
The state Department of Labor and Industries later that year denied his workers’ compensation claim, in part because it found “occupational disease based on mental conditions or mental disabilities caused by stress are specifically excluded from coverage by law,” according to court papers.
The lawsuit argues that Microsoft remains responsible for its workers’ well-being.
The Washington State Patrol’s Missing and Exploited Children Task Force has detectives who engage in similar computer forensics work to build cases against child pornographers. Team members are provided employee assistance and peer support programs available to all troopers, said Lt. Michael Eggleston of the special investigations section.
They also have access to a mental health professional who formerly investigated similar cases. Under a contract, the adviser talks with workers and suggests steps managers can take to keep people healthy, such as regular debriefings, requiring people to take breaks and giving them down time to decompress before heading out for vacation, Eggleston said.
The program was implemented a little over a year ago and most task force members have participated.
“I recognized we were going to have burnout, we were going to have fade,” Eggleston said.
Elana Newman is a professor of psychology at the University of Tulsa in Oklahoma and an expert on PTSD. Much of her work focuses on the occupational health of journalists and others regularly exposed to traumatic events.
She points to a new study detailing how media workers and those from human rights and humanitarian organizations are affected by repeated exposure to eyewitness video and images documenting traumatic events. Such “user-generated content” can include natural disasters, battlefield scenes, human rights abuses and terror executions. Tech has created a “digital frontline,” the study’s authors suggest.
“This is becoming, I think, a larger and larger issue,” Newman said.
Sara Soto said her husband’s wounds impact every aspect of their life, and with his limitations, she often feels like a single parent.
She misses the man she married, a gentle guy she’s known since high school, who loved going to movies and abandoned dreams of one day becoming a chef because he wanted a career that would better support a family.
Henry Soto hopes that by coming forward he can help others stay healthy and continue to do critical work.
It took him a long time to develop expertise in tracking down digital predators, he said. Being forced to step away brings its own torment.
“You feel guilt about stopping,” he said.
Scott North: 425-339-3431; firstname.lastname@example.org. Twitter: @snorthnews.
Here is Microsoft’s statement about a civil lawsuit brought by former employees in its Online Safety team:
“We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.
“Microsoft applies industry-leading technology to help detect and classify illegal imagery of child abuse and exploitation that are shared by users on Microsoft Services. Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services.
“This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority. Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.”