Human Content Filters Sue Microsoft for PTSD

Post-traumatic stress disorder (PTSD) is a mental disorder caused by exposure to very stressful, frightening or distressing events. It is usually associated with soldiers or the survivors of disasters, but Microsoft is being sued by two employees who were tasked with reviewing graphic content as part of the company’s online safety program. The Courthouse News Service reports that Microsoft did not warn them about the psychological dangers of their job, and provided inadequate support when their symptoms became apparent.

The purpose of Microsoft’s online safety program is to decide whether content should be removed from the web or reported to law enforcement. Programs like these have been increasingly adopted worldwide during the last decade, not least in response to public pressure to censor disturbing content found on the web. But whilst it is easy to demand that somebody should ‘do something’ about obscene material, there is no magic solution. What one person finds intolerable might also be legal, and businesses that engage in content filtering must consider their obligation not to infringe upon free speech, nor to impede legal business activities. Most societies draw a line somewhere, but it takes the subjective judgement of a human being to determine whether the line has been crossed in a specific instance. That clearly poses a risk to the individuals who perform this task, because they must survey graphic content so the rest of society can be spared from doing so.

The complaint filed by plaintiffs Henry Soto and Greg Blauert asserts that they…

…were entrusted with a high degree of responsibility. In 2008, Mr. Soto and others had “God like” status and could literally view any customer’s communications at any time… Throughout their careers at Microsoft, both plaintiffs were instrumental in saving children’s lives and providing evidence for successful prosecutions.

However, they argue that important employee safety protections were not implemented due to budget constraints. Soto alleges Microsoft involuntarily transferred him to the online safety team, and he was initially told he would be reviewing violations of the “terms of use” between Microsoft and its customers. He did not understand that he would be working on activities like…

…assisting law enforcement efforts to break up significant crime rings, the mob, the triad, and other violent groups, reviewing photos and video requiring him to witness horrible brutality, murder, indescribable sexual assaults, videos of humans dying and, in general, videos and photographs designed to entertain the most twisted and sick minded people in the world.

Soto’s reported symptoms include:

…panic attacks in public, disassociation, depression, visual hallucinations, and an inability to be around computers or young children, including, at times, his own son, because it would trigger memories of horribly violent acts against children that he had witnessed.

Despite the severity of these symptoms, an insurance claim for compensation was denied for a number of reasons, including:

“The worker’s condition is not an occupational disease… and is excluded from coverage.”

I am not in a position to judge the veracity of the complaint, nor the care shown by Microsoft to its employees. What I can say is that my experience shows that many seemingly sophisticated businesspeople adopt a profoundly naive attitude to tasks like those performed by Henry Soto and Greg Blauert. People want the problem of online filth to go away, so they prefer not to think about it. However, there are myriad responsibilities: to the public, to victims, and to employees too. The risks are profound, whilst many of the decisions are far from easy. The topic of filtering online content demands clear and courageous thinking about all the consequences, for everybody involved. As unpleasant as the material is, this is not an area where business leaders can shy away from their responsibilities.

Eric Priezkalns
Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), a global association of professionals working in risk management and business assurance for communications providers.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press. He is a qualified chartered accountant, with degrees in information systems, and in mathematics and philosophy.