Do you remember long division? I do, vaguely – I certainly remember mastering it at school: that weird little maths shelter you built, with numbers cowering inside like fairytale children, and a wolf-number at the door, trying to eat them (I had quite a vivid imagination as a child). Then came the carnage as the wolf got in – but also a sweet satisfaction at the end. The answer! You’d completed the task with nothing but your brain, a pen, and a scrap of paper. You’d thought your way through it. You’d done something, mentally. You were a clever boy.
Could I do long division now? Honestly, I doubt it. I’ve lost the knack. But it doesn’t matter, because decades ago we outsourced and off-brained that job to machines – pocket calculators – and now virtually every human on earth carries a calculator in their pocket, via their phones. Consequently, we’ve all become slightly dumber, certainly less skilled, because the machines are doing all the skilful work of boring mathematics.
Long division is, of course, just one example. The same has happened to spelling, navigation, translation, even the choosing of music. Slowly, silently, frog-boilingly, we are ceding whole provinces of our minds to the machine. What’s more, if a new academic study is right, this is about to get scarily and dramatically worse (if it isn’t already worsening), as the latest AI models – from clever Claude Opus 4 to genius Gemini 2.5 Pro – supersede us in all cerebral departments.
The recent study was done by the MIT Media Lab. The boffins in Boston apparently strapped EEG caps to a group of students and set them a task: write short essays, some using their own brains, some using Google, and some using ChatGPT. The researchers then watched what happened to their neural activity.
The results were quite shocking, though not entirely surprising: the more artificial intelligence you used, the more your actual intelligence sat down for a cuppa. Those who used no tools at all lit up the EEG: they were thinking. Those using Google sparkled somewhat less. And those relying on ChatGPT? Their brains dimmed and flickered like a guttering candle in a draughty church.
It gets worse still. The ChatGPT group not only produced the dullest prose – safe, oddly samey, you know the score – but they couldn’t even remember what they’d written. When asked to recall their essays minutes later, 78 per cent failed.
Most depressingly of all, when you took ChatGPT away, their brain activity stayed low, like a child sulking after losing its iPad. The study calls this ‘cognitive offloading’, which sounds sensible and practical, like a power station with a backup. What it really means is: the more you let the machine think for you, the harder it becomes to think at all.
And this ain’t just theory. The dulling of the mind, the lessening need for us to learn and think, is already playing out in higher education. New York Magazine’s Intelligencer recently spoke to students from Columbia, Stanford, and other colleges who now routinely offload their essays and assignments to ChatGPT.
They do this because professors can no longer reliably detect AI-generated work; detection tools fail to spot the fakes most of the time. One professor is quoted thus: ‘massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentially illiterate.’
In the UK the situation’s no better. A recent Guardian investigation revealed nearly 7,000 confirmed cases of AI-assisted cheating across British universities last year – more than double the previous year, and that’s just the ones who got caught. One student admitted submitting an entire philosophy dissertation written by ChatGPT, then defending it in a viva without having read it. The result? Degrees are becoming meaningless, and the students themselves – bright, ambitious, intrinsically capable – are leaving education perhaps less able than when they entered it.
The inevitable endpoint of all this, for universities, is not good. Indeed, it’s terminal. Who is going to take on £80k of debt to spend three years asking AI to write essays that are then marked by overworked tutors using AI – so that no actual human does, or learns, anything? Who, in particular, is going to do this when AI means there aren’t many jobs at the end, anyhow?
I suspect 80 to 90 per cent of universities will close within the next ten years. The oldest and poshest might survive as finishing schools – expensive playgrounds where rich kids network and get laid. But almost no one will bother with that funny old ‘education’ thing – the way most people today don’t bother to learn the viola, or Serbo-Croat, or Antarctic kayaking.
Beyond education, the outlook is nearly as bad – and I very much include myself in that: my job, my profession, the writer. Here’s a concrete example. Last week I was in the Faroe Islands, at a notorious ‘beauty spot’ called Trælanípa – the ‘slave cliff’. It’s a mighty rocky precipice at the southern end of a frigid lake, where it meets the sea. The cliff is so-called because this is the place where Vikings ritually hurled unwanted slaves to their grisly deaths.
Appalled and fascinated, I realised I didn’t know much about slavery in Viking societies. It’s been largely romanticised away, as we idealise the noble, wandering Norsemen with their rugged individualism. Knowing they had slaves to wash their undercrackers rather spoils the myth.
So I asked Claude Opus 4 to write me a 10,000-word essay on ‘the history, culture and impact of slavery in Viking society.’ The result – five minutes later – was not far short of gobsmacking. Claude chose an elegant title (‘Chains of the North Wind’), then launched into a stylish, detailed, citation-rich essay. If I had stumbled on it in a library or online, I would have presumed it was the product of a top professional historian in full command of the facts, who had taken a week or two to write it.
But it was written by AI. In about the time it will take you to read this piece. This means most historians are doomed (like most writers). This means no one will bother learning history in order to write history. This means we all get dumber, just as the boffins in Boston are predicting.
I’d love to end on a happy note. But I’m sorry, I’m now so dim I can’t think of one. So instead, I’m going to get ChatGPT to fact-check this article – as I head to the pub.