Facebook says spam filter mayhem not related to coronavirus

Facebook said this week it would be sending all of its contracted human moderators home. The company cannot offer remote working for its moderation staff owing to privacy considerations over the material they handle, and so its moderation work will be done exclusively by permanent employees for the foreseeable future.

Facebook says the absence of human moderators was not related to the spam filter error, and that it believes it is well prepared to moderate the site with a vastly reduced human workforce.

Kang-Xing Jin, Facebook’s head of health, said: “We believe the investments we’ve made over the past three years have prepared us for this situation. With fewer people available for human review, we’ll continue to prioritise imminent harm and increase our reliance on proactive detection in other areas to remove violating content. We don’t expect this to impact people using our platform in any noticeable way.”

Facebook is not the only technology firm to have sent home its moderators. YouTube announced on Monday that it would be relying more on AI to moderate videos in the future. Unlike Facebook, the video site did not commit to the change being invisible to users. Instead, it said more videos would be taken down as a result of the lack of human oversight.

Normally, YouTube videos are flagged by an AI and then sent to a human reviewer to confirm they should be taken down. But now videos will far more frequently be removed on the say-so of an AI alone. The company says it will not be giving creators a permanent black mark, or “strike”, if their videos are taken down without human review, since it accepts that it will inevitably end up taking down “some videos that may not violate policies”.