While everyone is rightly calling for the Post Office Horizon postmasters’ convictions to be overturned and for them to be given proper compensation, there is arguably a deeper issue at stake: how much society trusts technology.
We now know that the Horizon accounting software was not reliable. But during the sub-postmasters’ original trials, prosecutors for the Post Office argued that it was dependable and the courts took this for granted.
Why did the courts rely on the evidence of such a faulty IT system? The answer points to a society-wide problem: people falling into the trap of thinking computers are infallible. You might assume a computer is no more likely to lie to you than an abacus. But that ignores the fact that computers are programmed by humans, who are just as likely to be incompetent or lazy as anyone else. This can lead to some terrifying scenarios.
Take the health sector. Currently, when an anaesthetist presses a button to put you to sleep, by law they must be competent, with up-to-date qualifications backed by years of training. Yet what happens when they press that button is anyone’s guess, because a computer does it, and we have little idea whether that computer is reliable or who programmed the underlying software.
Remarkably, there are no regulations governing the qualifications of the people programming any of these systems, whether for accounting (as in Horizon) or for delivering anaesthetics. This means we are unable to distinguish between safe and unsafe computer systems. Computers have advanced faster than the law and society’s understanding, and this is now leading to problems, as Horizon clearly shows. Those problems then escalate into prosecutions.
But once you get as far as the courts, it gets worse.
English and Welsh common law has a presumption that computer evidence is reliable. The presumption makes some sense, because a court can’t be expected to understand technical arguments about computer programming or bugs one way or the other. And if computer evidence weren’t assumed to be reliable, we would all be trying to argue against speeding fines, unpaid parking tickets, bank fees and much more.
The problem is that the law papers over some terrifyingly large cracks. In reality, we have no idea whether any given piece of computer evidence is reliable, because the people who built the systems that produce it may not have been competent. Their programs could easily make a mess of any evidence. And the court simply can’t tell whether the software it is relying on is wonderful or, as Fujitsu’s Horizon certainly was, terrible. Indeed, the manufacturers of these systems can’t always tell either: they don’t know how to recruit and manage good programmers, and so they unwittingly employ quacks.
It gets worse. Many computer quacks don’t even realise they are quacks. They know so little about good programming that they don’t realise they are not good programmers. This is called unconscious incompetence, and it does not stop ignorant ‘experts’ from giving witness statements in court.
Luckily, there is a way to fix this sorry mess. In the 19th century, quack doctors were a danger to society, and the government responded by passing the Medical Act 1858 because, in the Act’s opening words, ‘it is expedient that Persons requiring Medical Aid should be enabled to distinguish qualified from unqualified Practitioners.’ We now take the idea of registering qualified doctors to be self-evident.
It is time the government legislated so that everyone can avoid becoming victims of quack computer systems. A computing equivalent of the uncontroversial Medical Act would require programmers to be registered, to have a decent education and to hold respectable, relevant qualifications. Of course, this wouldn’t apply to hobbyists or to children learning to program. But if you wanted to program a serious system, you would have to be registered as competent.
Realistically, such qualifications and any legal requirements will be a long time coming. In the meantime, computer systems should be audited and given a safety and reliability rating. This would at least reduce the risk of people, such as innocent postmasters, being found guilty of crimes they did not commit. It is the least we can do to make sure the Horizon scandal is never repeated.