
The memory gap: how technology took over the mind

What happens when we outsource part of our brains to the internet?

13 August 2016

9:00 AM



The Spectator podcast: Listen to Isabel Hardman, Lara Prendergast, Charlotte Jee, editor of Techworld, and Professor Martin Conway, head of psychology at City University, discuss the memory gap.


Ask me what I had for lunch yesterday and I couldn’t tell you. Names disappear as swiftly as smoke. Birthdays, capital cities, phone numbers — the types of facts that used to come so readily — are no longer forthcoming. I’m 26, yet I feel I have the memory of a 70-year-old. My brain is a port through which details pass, but don’t stay.

I’m not alone. Many young people feel our memories have been shot to pieces. It’s the embarrassing secret of my generation. We can hardly recall a thing. We joke about having early onset Alzheimer’s, often with a hint of real anxiety. We know that when we reach to remember any detail — a route, a phrase, a historical fact — our minds do not perform at the critical moment. So we reach instead for our phones, which are much more trustworthy. We do so as naturally as we might scratch an itch. How to get from A to B. How to make risotto. How to write a magazine article. Can’t spell a word? No bother — tap a guess on to your screen and Google will figure it out.

Every day, and increasingly in every way, we are outsourcing our brains to the internet. But at what cost? As smartphones get smarter, it’s easy to argue that we’re getting thicker. That’s not quite true. Our brains are not necessarily shrivelling; they are adapting. Thanks to technology, the need to know has been replaced by the ability to find out. Younger people, especially the ‘digital natives’ who have never known life without the web, are most comfortable in this new environment. In almost every profession, expertise is being made redundant. In my profession, journalism, the sharp kids with fresh IT skills can claim to have the edge over the seasoned hacks (or at least that’s what they want our bosses to think). Or look at the taxi industry, which has been revolutionised by apps such as Uber. London’s black cabbies have discovered that ‘the Knowledge’ — that impressively encyclopaedic study of the capital’s streets — has become all but obsolete. With Uber, anyone who can drive and has a smartphone can earn money as a taxi driver.

Even in high-pressure fields such as medicine, politics or law, the speed with which information can be found means that professionals rely on technology as much as everyone else. Some doctors freely admit to searching Google for symptoms as their patients describe them. Lawyers no longer need to remember the intricacies of tort law — at least not parrot fashion.

What does that mean for education? University finals, where students rely on memory unaided, seem an anachronism. Once, it was the perfect training for later life: learn a subject, store the information, use it later to your career advantage. But when every fact is just a click or two away, what’s the point?


I can at least still remember what it used to be like to commit a fact to memory. You could take pride in it. It was a delicious, joyful thing, a gentle high. That has now been replaced by addictive short, sharp hits of dopamine mixed with adrenaline. Who can search first? Who has the fastest fingers tap-tapping away on their phone? The pleasure of contemplation has been replaced by the constant buzz of ephemera passing us by: on Instagram, on Facebook, on crack-like news apps. Even language is often bypassed; we increasingly communicate via images to save time. Forget a thousand words; send an emoji. Or a picture via Snapchat that will self-destruct after a few seconds.

Neurologists talk about the ‘plasticity’ of the brain — its ability to adapt its function according to which neural pathways are most employed — and there is evidence to suggest that our brains are changing to meet the demands of this high-octane modern world. It’s reactionary to assume that this is bad news: the idea that technology is ruining our ability to think and communicate properly is as old as technology itself. People blamed the telegraph for curtailing speech. Radio was thought to be dangerously mindless, and everybody has always said that television rots the brain. But, despite all these supposed obstacles, humanity has just become more ingenious, so much so that we invented the internet, a medium for being clever without using our intelligence.

If our brains are changing to the new digital environment, maybe we should feel encouraged by our resourcefulness. Perhaps memory is something we can afford to sideline, and instead we can focus on skimming off facts and figures while relying on our short-term memory. Ensuring that knowledge is actually remembered requires time and concentration. And in this world of instant notifications and non-stop info, speed is king. Why bother learning ten things when your phone can find out any one of a million things in a few seconds?

The answer is that the brain requires exercise, and we allow it to atrophy at our peril. While we get better at juggling ideas, our memories are taking a battering. An academic study into the ‘Google effect’ showed that people tend not to bother remembering something if they believe it can be looked up later. People were more likely to index; to remember where information was located rather than the actual information itself. That study was five years ago and technology has moved on significantly. Want to bet that people’s memories have got better — or worse — since then? Last year, 91 per cent of people surveyed for another study into ‘digital amnesia’ said they used the internet as an ‘online extension’ of their brain and 44 per cent relied on their smartphone. Of 6,000 adults surveyed across Europe, more than a third turned to computers to help recall information. The UK had one of the worst rates: more than half of British adults admitted that they don’t even try to remember answers, they just search online. We are becoming a flabby-brained nation.

Techno-libertarians rejoice at the idea of computers becoming integral to the human experience. The big nerdy fantasy is that we’ll all become hybrids — part human, part computer. And with the advent of wearable tech — Apple watches, Fitbits and so on — that process seems well under way. But if we cease to be fully human, life must become something less. Already, young people depend on technology for peace of mind. When our phone batteries run out, we feel a deep anxiety, not because we desperately want to read our emails, but because our gizmos are now part of who we are.

Worse still, the machines have an insatiable appetite for more information, which must come at our expense. My phone keeps telling me it has run out of memory and that I must buy more. Then there is the ‘cloud’, a separate memory bank where the phone also suggests I store things. I assume this cloud brain floats somewhere above California. Keeping it full of my information could become a very expensive habit. But the thought of losing that stuff is terrifying. I would (and do) pay handsomely to insure against it.

Suppose my cloud and all other clouds vanished, though. What if, instead of a nuclear strike or tsunami, an electromagnetic pulse wiped every hard drive, every detail of every bank account, every family photo? Then what? History is littered with examples of knowledge being destroyed or damaged. Details about ancient Roman sanitation were lost for hundreds of years during the medieval period; the destruction of the Great Library of Alexandria — one of the ancient world’s great archives of knowledge — should still serve as a warning for us. We assume that nothing is ever lost online, but that’s not true. A computer science study from 2012 showed that almost a third of recorded history shared over social media during the ‘Arab Spring’ uprising in Egypt has since been deleted.

If everything is lost in the digital ether and nobody has bothered to remember anything, then what? The Long Now Foundation hopes to become a ‘long-term cultural institution’ to counter the fact that ‘civilisation is revving itself into a pathologically short attention span’. Its grandiose plan — to help archive digital material in a responsible way for the next 10,000 years — sounds whimsical, but the idea behind it opens up an interesting discussion: how do we preserve our experiences so that when future civilisations look back at us, they don’t just see another dark age?

The problem of digital amnesia is more immediate. John Locke thought that memory and our sense of self were inevitably linked, because personal identity was founded on consciousness. (At least that’s what his Wikipedia page says.) Surely that’s right. When memories fail in old age, we feel we lose a part of us that rests deep within. That is why Alzheimer’s, which afflicted both my paternal grandparents, is such a cruel disease. It may well be that memory is more spiritual than we like to admit. By using our minds, we nourish a part of us that goes beyond the physical. Equally, by storing memory outside of ourselves on a piece of technology, we lose something fundamental.

Ted Hughes recommended memorising poetry not just for its own sake but as a form of exercise — for mind and soul (thanks again, Wiki). My plan now is to try to do just that: remember a few more things each day, rely less on my smartphone and have a go at learning the odd poem or two. Reflection, I hope, can cure this modern affliction.

