A former moderator for Facebook claims he has suffered “serious psychological injuries” as a consequence of being exposed to explicit content during his work for the social media giant.
Sean Burke has lodged a High Court action seeking damages against Facebook Ireland and CPL Solutions, the Dublin outsourcing company. He alleges that he was required to “repeatedly view extremely violent, graphic and upsetting material, some of which involved children” as part of his work.
In September, the Personal Injuries Board gave a group of former moderators in Ireland the go-ahead to serve proceedings against Facebook in the High Court. Under section 17 of the Personal Injuries Assessment Board Act 2003, if a plaintiff’s injury consists of psychological damage that would be difficult to assess by the board, it can give permission for the claim to be pursued through the courts.
According to the new legal filings, Burke says he was recruited through CPL to work as a content moderator at Facebook’s offices at the Beckett Building in East Wall in Dublin in November 2017, at a rate of €12.98 an hour. This rose to €13.85 an hour in July 2018.
It is Burke’s claim that he did not undergo, and was not subject to, any form of screening or health check by Facebook prior to commencing employment. He took part in eight days’ training, made up of classroom lectures that he claims were “wholly inadequate in all the circumstances”.
He alleges that on November 2, during a telephone call in which CPL informed him that his interview had been successful, he was advised in an “off-the-cuff and informal manner” that “there is a chance, it doesn’t happen that often, that you may be reviewing disturbing content”.
Burke claims he was never informed, and never expected, that he would be required to view material of such an extreme nature or in such quantities. He said that counselling services provided through Facebook “were entirely insufficient” and inadequate.
After his initial training he says he began working the night shift, from 6pm until 2am, independently reviewing content for compliance with Facebook’s policies as part of a group of around 11 people that later grew to 25, according to his claim.
Initially, he says, he did well, but by around February 2018 the combination of allegedly “extreme content, oppressive targets and insufficient time off” had begun to take its toll on him.
Burke alleges that content he had to review included: videos showing the rape and/or sexual assault of children; a compilation of clips, set to music, showing people dying by suicide; a collection of hundreds of photos depicting people self-harming; a video of a man being beaten to death with planks of wood; videos and images of beheadings; videos showing people being electrocuted and impaled; and videos showing individuals being stabbed in the stomach.
Burke claims he attended his own GP and Facebook’s health coaching team many times and spoke with counsellors, explaining that he was experiencing nightmares, anxiety and other symptoms. Despite this, he alleges, no steps were taken to assist him. He says he asked to be put on daytime shifts but was told none were available.
Burke says he became “increasingly numb and desensitised to the horrific content with which he was faced”.
He is claiming negligence and breach of duty by Facebook and CPL, alleging they failed to have reasonable regard to his safety, or to take reasonable care for his physical and mental health and wellbeing, and exposed him to risk, danger and/or injury.
He claims they did not ensure that he was not exposed to excessively graphic and disturbing content and failed to carry out adequate assessment of whether he was capable of coping with the impact content moderation work had on him.
He also claims they did not provide adequate psychological support on a regular basis to deal with the mental and emotional strain. He is seeking damages for loss of earnings, medical expenses and other expenses.
When contacted on Friday, a spokesman for CPL said it had no comment. A representative for Facebook said it did not comment on ongoing cases.
Facebook has previously said it was committed to providing support for content reviewers and recognised that reviewing certain content can be difficult. It has also previously said that all those who review content go through an in-depth training programme on community standards and have “access to extensive psychological support”. The firm has introduced technical solutions to help moderators, such as tools to blur graphic images.