ANOTHER week, and another story that dominated tech headlines around the world – the leaking of internal documents revealing how Facebook is coping – or struggling to cope – with the challenge of moderating content at scale.
As reported in a Guardian investigation, and then flashed around the world, Facebook is engaged in a rolling battle against a ceaseless tide of inappropriate content, while at the same time its moderation staff find some of its rules confusing, and the task challenging.
Removing revenge porn and sexual content is a growing priority, but the moderators face a huge range of content deemed violent, aggressive, sexualised or otherwise inappropriate – except that it’s not always clear that the content is indeed ‘wrong’.
For example, problems arise in classifying several types of content, with art and historical material often crossing the line into ‘inappropriate’ territory – perhaps best demonstrated late last year, when Facebook briefly censored one of the most iconic photos of the 20th century: that of nine-year-old Kim Phuc running naked from a napalm attack during the Vietnam War.
That photo was swiftly reinstated after an international outcry, with Facebook shortly afterwards revealing that it would begin to monitor content both more closely, and more sensitively.
However, according to the Guardian, the scale and nature of the content to be checked is staggering – with the reports showing almost 54,000 potential cases of revenge pornography and “sextortion” dealt with in a single month – and that’s just reported content.
In January, Facebook disabled more than 14,000 accounts related to these types of grotesque sexual abuse, with 33 cases relating to children.
With its absolutely enormous user base, there’s no way for the company (or any other huge company or site) to actively track, monitor or moderate all content; instead, it relies largely on user reports before it acts.
However, the leaked report gives an insight into the scale of problems Facebook now faces as it adjusts to its growing role as a digital and news content provider.
It’s a story that we’re all likely to read more about, as Facebook, and other tech titans, find themselves increasingly fighting malicious content that no company would endorse. Facebook’s battles represent a content war that looks set to rage on.