Digital technology has altered many facets of our lives in the twenty-first century. The newest of these technologies is generative artificial intelligence (AI), which is transforming how we learn and raising significant legal and philosophical questions about what it means to “outsource thinking” to chatbots and other AI tools.
However, technology that changes our way of life is nothing new. Around the 1960s, analog technology began giving way to digital, and this “digital revolution” gave rise to the internet. People in their early 80s belong to a generation that experienced this transformation firsthand.
So what can we learn from this generation about how technology affects the aging brain? A comprehensive new study by researchers at Baylor University and the University of Texas in the United States provides important answers.
The study, which was published today in Nature Human Behaviour, found no evidence to support the “digital dementia” theory. In fact, it discovered that older adults who use computers, smartphones, and the internet may actually have reduced rates of cognitive deterioration.
“Digital dementia”: what is it?
Much has been written about the potential negative impact of technology on the human brain.
In 2012, the German neuroscientist and psychiatrist Manfred Spitzer proposed the “digital dementia” hypothesis: that the growing use of digital devices has fostered an excessive dependence on technology and, as a result, eroded our overall cognitive abilities.
Prior research has identified three areas of concern related to technology use:

- more time spent passively using screens, for activities such as watching TV or scrolling social media that demand little attention or engagement
- offloading cognitive functions to technology, such as no longer remembering phone numbers because they are stored in our contact list
- greater proneness to distraction.
Why is this new study important?
We know technology can affect how our brains develop, but its effect on how our brains age is less well understood.
This new study by neuropsychologists Jared Benge and Michael Scullin is important because it examines the impact of technology on older people who have experienced significant changes in the way they use technology across their lives.
The study pooled the findings of many earlier investigations in what is known as a meta-analysis. The authors searched for research on technology use in people over 50 and examined any links to dementia or cognitive decline. They identified 57 studies containing data from almost 411,000 adults. In the included studies, cognitive decline was measured either by reduced performance on cognitive tests or by a dementia diagnosis.
A reduced risk of cognitive decline
Overall, the study found that greater technology use was linked to a lower risk of cognitive impairment. Statistical tests were used to calculate the “odds” of cognitive decline based on technology exposure. The overall odds ratio in this study was 0.42; an odds ratio of less than 1 indicates a reduced risk from the exposure. This means greater technology use was associated with a 58% lower risk of cognitive decline.
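As a rough worked example of where that percentage comes from (a simplification, since odds and risk are not strictly the same thing), the figure is obtained by subtracting the odds ratio from one: (1 − 0.42) × 100% = 58%.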
This benefit was still observed even after controlling for other factors known to contribute to cognitive decline, such as socioeconomic status and other health conditions.
Interestingly, the apparent effect of technology use on brain function was comparable to, or even greater than, that of other established protective factors, such as physical activity (which reduces risk by about 35%) or keeping blood pressure in check (which reduces risk by about 13%).
However, the benefits of controlling blood pressure and boosting physical activity have been the subject of many more studies over the years, and the ways in which they protect our brains are far better understood.
It is also much easier to measure blood pressure than it is to measure technology use. A strength of this study is that it accounted for these challenges by focusing on certain aspects of technology use while excluding others, such as brain-training games.
These results are promising. However, we still cannot conclude that using technology is what improves cognitive performance. More research is needed to understand why this association might exist, and whether these findings hold for populations underrepresented in this study, particularly people in low- and middle-income countries.
A question of ‘how’ we use technology
In reality, it is simply not possible to live in the modern world without using technology. These days we do practically everything online, from booking our next vacation to paying our bills. Perhaps we should instead be thinking about how we use technology.
Cognitively demanding activities such as reading, learning a new language, and playing music, particularly in early adulthood, can help protect our brains as we age.
Increased use of technology throughout our lives, as we adjust to new software updates or learn to use a new smartphone, may be one way of exercising our memory and cognitive abilities. This “technological reserve” has been proposed as potentially beneficial to our brains.
Additionally, technology may help us maintain our social connections and prolong our independence.
A rapidly changing digital world
Although the results of this study suggest that not all digital technology is harmful to us, our interactions with and dependence on it are changing rapidly. The effects of AI on the aging brain will only become clear in the decades ahead. Given our capacity to adapt to previous technological advances, and the possibility that doing so may enhance cognitive performance, the future might not be entirely bleak.
For instance, advances in brain-computer interfaces offer fresh hope to people living with neurological illness or disability. However, technology also carries genuine risks, particularly for younger generations, including harms to mental health. Future research will help determine how to maximize the benefits of technology while minimizing its potential drawbacks.