Alex Krasodomski

Monitoring social media is easier said than done

The three British girls who packed their bags and took a flight to Turkey have apparently crossed the border into Syria. Their intention seems to be to join the Islamic State, and it looks like they may have succeeded. It emerged over the weekend that there had been contact between one of the girls and Aqsa Mahmood, a Scottish woman who had travelled to Syria herself. The contact began on Twitter, and it appears Mahmood played a role in their journey to Turkey and now into the heart of the conflict in Syria.

Criticism turned on the security services: according to Aamer Anwar, the lawyer for the family of Aqsa Mahmood, they are not even doing the ‘basics’ to prevent this kind of migration. The ‘basics’ seem to include social media monitoring. In an interview with the BBC, Anwar described his ‘incredulity’ that messages seemingly forewarning the girls’ actions weren’t acted upon. This mirrors the report produced by the Intelligence and Security Committee of Parliament following its inquiry into the murder of Lee Rigby. Writing in the Telegraph, Sir Malcolm Rifkind slams Facebook for failing ‘to notify the authorities when their systems appear to be used by terrorists’. Around six months before the murder, one of the killers, Michael Adebowale, had sent a private Facebook message in which he ‘expressed his intent to murder a soldier in the most graphic and emotive manner’. The notion is that the security services, or even the social media companies themselves, ought to be catching these messages and acting on them.

The problem is that these ‘basics’ aren’t simple at all. One of the key problems is the sheer volume of data. Conservative estimates place the total messages sent per day across Facebook’s systems (Messenger and WhatsApp) at a staggering fifty billion, or just over half a million messages a second. On Facebook alone. Moderating this manually is, of course, laughable. The only reason I mention it is that much of Facebook’s visual content is in fact moderated manually; the bleak reality of how this takes place is brought to life in a Wired article. It is not a system one could begin to defend in the interests of national security.

The process must therefore be automated. A system should be built, so it goes, that automatically checks each message to see whether it constitutes a terror threat. But how? A ‘keyword match’ is hopelessly clumsy and entirely unfit for purpose, as the first sketch below illustrates: a Google search for ‘jihad’ throws up forty million English-language sites. The answer probably lies in ‘Natural Language Processing’, or NLP, a technology that can help computers process vast amounts of text very quickly and try to guess at its meaning. But processing language is one of the most difficult things we can ask a computer to do, and classifiers are never perfect, their quality diminishing as the way we talk shifts and changes.
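To make that clumsiness concrete, here is a minimal keyword filter in Python. The keyword list and the example messages are invented for illustration; the point is only that naive matching flags harmless chatter while missing a genuinely worrying message that uses no obvious vocabulary.

```python
# A deliberately naive keyword filter, of the sort a 'keyword match' implies.
# The keyword list and example messages below are invented for illustration.
SUSPICIOUS_KEYWORDS = ("jihad", "bomb", "attack")

def flag_message(text: str) -> bool:
    """Flag a message if any keyword appears anywhere in it."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in SUSPICIOUS_KEYWORDS)

examples = [
    "An academic history of the concept of jihad",   # flagged: scholarship
    "That gig last night was the bomb",              # flagged: slang
    "His speeches are so bombastic",                 # flagged: substring accident
    "One-way tickets to Istanbul, leaving Tuesday",  # missed: no keyword at all
]

for message in examples:
    print(flag_message(message), "|", message)
```

On the article’s own figure of fifty billion messages a day, even a one-in-a-thousand false-positive rate would flag around two million innocent messages an hour.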
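And here is the shape of the NLP alternative: a toy statistical classifier built with the scikit-learn library. Everything in it (the training sentences, the labels, the test message) is invented, and a real system would need vast labelled corpora. The sketch only shows why the output is a probability that someone must turn into a decision, not a clean verdict.

```python
# A toy statistical text classifier, illustrating the machinery NLP relies on.
# All training data and labels here are invented; a deployable system would
# need enormous labelled corpora, and would still make mistakes.
# Requires scikit-learn: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_texts = [
    "join us brothers the caliphate awaits you",         # labelled threatening
    "cross the border and fight for the islamic state",  # labelled threatening
    "we discussed the history of jihad in lectures",     # labelled benign
    "a great documentary on the islamic state tonight",  # labelled benign
]
labels = [1, 1, 0, 0]  # 1 = threatening, 0 = benign (toy labels)

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(training_texts, labels)

# The output is a probability, not a verdict: someone must choose a threshold,
# and as slang and code words drift, yesterday's model quietly decays.
test_message = "thinking about the journey across the border next week"
print(classifier.predict_proba([test_message])[0][1])
```

Even a good classifier of this kind only re-ranks the haystack; every threshold trades missed threats against a flood of false alarms.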
It is, at this stage, a technological bridge too far, even before any discussion of whether it is ethical or whether it violates a fundamental right to privacy. It is reasonable to expect the security services to have the power and the right to intercept suspects’ communications, just as we would expect them to try to intercept suspects’ phone calls. It is not reasonable to expect them (or Californian social media companies) to monitor social media in order to pre-emptively spot signs that someone is planning an act of terrorism.

Both Michael Adebolajo and Michael Adebowale were known to police and MI5 for years before the murder of Drummer Rigby. Once someone is a suspect, it is reasonable that something be put in place to monitor their communications. Generally speaking, Facebook and Twitter are forthcoming with requests from UK and US security services, and will provide information on suspects. But this is only possible once a suspect has been identified.

Something doesn’t seem right about the ease with which the three girls made it to Turkey and across the border. But if we must look for someone to blame, we ought to begin with the extremists in Syria and elsewhere who deluded these children into thinking that Isis offered them a chance at adventure, romance or utopia.

Alex Krasodomski is a researcher at the Centre for the Analysis of Social Media at Demos. He can be found tweeting at @akrasodomski.
