In the 21st century, digital technology has changed many aspects of our lives. Generative artificial intelligence (AI) is the latest newcomer, with chatbots and other AI tools creating considerable debate regarding what it means to “outsource thinking”.
But the emergence of technology that changes the way we live is not a new issue. The change from analogue to digital technology began around the 1960s, and this “digital revolution” is what brought us the internet. An entire generation of people who lived and worked through this evolution are now entering their early 80s.
So what can we learn from them about the impact of technology on the ageing brain? A comprehensive new study from researchers at the University of Texas and Baylor University in the United States provides important answers.
Published today in Nature Human Behaviour, it found no supporting evidence for the “digital dementia” hypothesis. In fact, it found the use of computers, smartphones and the internet among people over 50 might actually be associated with lower rates of cognitive decline.
What is ‘digital dementia’?
Much has been written about the potential negative impact of technology on our brains.
According to the “digital dementia” hypothesis, introduced by German neuroscientist and psychiatrist Manfred Spitzer in 2012, increased use of digital devices has resulted in an over-reliance on technology. In turn, this has weakened our overall cognitive ability.
Three areas of concern regarding the use of technology have previously been noted:
An increase in passive screen time. This refers to technology use which does not require significant thought or participation, such as watching TV or scrolling social media.
Offloading cognitive tasks to technology, such as no longer memorising phone numbers because they are kept in our contact list.
Increased distraction.
Why is this new study important?
We know technology can impact how our brain develops. But the effect of technology on how our brain ages is less understood.
This new study by Jared Benge and Michael Scullin is important because it examines the impact of technology on older people who have experienced significant changes in the way they use technology across their lives.
The new study performed what is known as a meta-analysis, where the results of many previous studies are combined. The authors searched for studies examining technology use in people aged over 50, and examined the association with cognitive decline or dementia. They found 57 studies which included data from more than 411,000 adults. The included studies measured cognitive decline based on lower performance on cognitive tests or a diagnosis of dementia.
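To give a feel for the mechanics, here is a minimal Python sketch of one common way to pool odds ratios across studies: fixed-effect, inverse-variance weighting of log odds ratios. The input figures are invented purely for illustration; they are not the 57 studies above, and the published analysis used its own methods.

```python
import math

# Hypothetical example data: (odds ratio, standard error of the log odds
# ratio) for four made-up studies. These figures are invented for
# illustration and are NOT taken from the published meta-analysis.
studies = [(0.38, 0.12), (0.51, 0.09), (0.44, 0.15), (0.35, 0.20)]

# Fixed-effect, inverse-variance weighting on the log scale: more precise
# studies (smaller standard errors) contribute more to the pooled estimate.
weights = [1 / se ** 2 for _, se in studies]
log_odds_ratios = [math.log(odds_ratio) for odds_ratio, _ in studies]

pooled_log_or = sum(w * lor for w, lor in zip(weights, log_odds_ratios)) / sum(weights)
pooled_or = math.exp(pooled_log_or)

print(f"Pooled odds ratio: {pooled_or:.2f}")
```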
A reduced risk of cognitive decline
Overall, the study found greater use of technology was associated with a reduced risk of cognitive decline. Odds ratios were used to determine the “odds” of having cognitive decline based on exposure to technology. An odds ratio under 1 indicates a reduced risk from exposure, and the combined odds ratio in this study was 0.42. This means higher use of technology was associated with a 58% risk reduction for cognitive decline.
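As a quick check of the arithmetic, here is how the reported odds ratio of 0.42 yields the 58% figure (a minimal sketch that, like the summary above, reads the odds ratio as a relative risk reduction):

```python
# Pooled odds ratio reported by the meta-analysis.
odds_ratio = 0.42

# An odds ratio under 1 means lower odds of cognitive decline with greater
# technology use: 1 - 0.42 = 0.58, i.e. an approximate 58% risk reduction.
risk_reduction = 1 - odds_ratio
print(f"Approximate risk reduction: {risk_reduction:.0%}")  # 58%
```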
This benefit was found even when the effects of other factors known to contribute to cognitive decline, such as socioeconomic status and other health factors, were accounted for.
Interestingly, the magnitude of the effect of technology use on brain function found in this study was similar to, or stronger than, that of other known protective factors, such as physical activity (approximately a 35% risk reduction) or maintaining healthy blood pressure (approximately a 13% risk reduction).
However, it is important to understand that there are studies conducted over many years examining the benefits of managing blood pressure and increasing physical activity, and the mechanisms through which they help protect our brains are far better understood.
It is also a lot easier to measure blood pressure than it is to measure technology use. A strength of this study is that it addressed these difficulties by focusing on certain aspects of technology use while excluding others, such as brain training games.
These findings are encouraging. But we still can’t say technology use causes better cognitive function. More research is needed to see if these findings are replicated in different groups of people, especially those underrepresented in this study, and to understand why this relationship might occur.
A question of ‘how’ we use technology
In reality, it’s simply not feasible to live in the world today without using some form of technology. Everything from paying bills to booking our next holiday is now almost completely done online. Maybe we should instead be thinking about how we use technology.
Mentally stimulating activities such as reading, learning a new language and playing music – particularly in early adulthood – can help protect our brains as we age.
Greater engagement with technology across our lifespan may be a form of stimulating our memory and thinking, as we adapt to new software updates or learn how to use a new smartphone. It has been suggested this “technological reserve” may be good for our brains.
Technology may also help us stay socially connected, and help us stay independent.
A rapidly changing digital world
While findings from this study show it’s unlikely all digital technology is bad for us, the way we interact with and rely on it is rapidly changing.
The impact of AI on the ageing brain will only become evident in future decades. However, our ability to adapt to historical technological innovations, and the potential for this to support cognitive function, suggests the future may not be all bad.
For example, advances in assistive technologies offer new hope for those experiencing the impact of neurological disease or disability.
However, the potential downsides of technology are real, particularly for younger people. Future research will help determine how we can capture the benefits of technology while limiting the potential for harm.
Postdoctoral Research Fellow, Neuroscience Research Australia (NeuRA)
This article is republished from The Conversation under a Creative Commons license. Read the original article.