Dementia risk factors – untangling fact from fiction
Worldwide, there are around 50 million people with dementia, a figure predicted to rise to 152 million by 2050. We urgently need to better understand what factors can increase our risk of the devastating diseases that cause dementia, so that this knowledge can ultimately be translated into preventative strategies. Most of us will have seen sensational headlines in the press claiming that a new study has identified a simple lifestyle habit or superfood that can significantly lower the risk of developing dementia. But, as this article discusses, dementia is particularly challenging to study, and we need to be very cautious when interpreting research evidence. So how should researchers go about answering the question we all want answered: how can we best protect ourselves against dementia?
Challenge one: measuring cognitive decline
Around 40% of worldwide dementia diagnoses are thought to be attributable to 12 modifiable risk factors. Many of the associations between lifestyle factors and dementia risk are based on results from large-scale observational studies, such as the UK Biobank Study and the Million Women Study. These studies collect data on various lifestyle, physiological and genetic factors for a large number (usually thousands) of people who have consented to take part. By linking to participants’ health records, epidemiologists track the group over time and then analyse the data to see if any factors are significantly associated with an increased risk of developing a specific disease.
However, cognitive decline (which typically precedes dementia) can affect individuals in very different ways, depending on the pathology of the underlying disease and which region of the brain is affected. Consequently, a wide range of different methods have been developed to assess cognitive decline in these large-cohort studies. These include assessments of mental functions such as concentration, verbal fluency and language skills, short- and long-term memory, and awareness of time and place (eg the Mini-Mental State Examination (MMSE) and the Mini-Cog test). Meanwhile, advanced imaging techniques such as MRI scans can identify signs of tissue shrinkage or vascular problems in the brain. These different types of pathology often precede dementia and can contribute to worsening cognitive decline. Other methods assess cognitive decline indirectly, for instance through questionnaires on the individual’s ability to carry out everyday activities, such as the Bristol Activities of Daily Living Scale (BADLS).
With so many different methods used, it can be difficult to compare studies, for instance, as part of a meta-analysis of all the evidence. Dr Thomas Littlejohns, senior epidemiologist in the Translational Epidemiology Unit (TEU) said: ‘Cognition is highly complex. There are numerous different tests which measure either specific aspects of cognition, such as memory or executive functioning, or provide a measure of global cognition. These measures are incredibly useful in a research setting as they can enable us to investigate risk factors in association with cognitive decline prior to dementia. But it can be challenging to interpret results across studies which use different cognitive tests.’
Challenge two: overcoming reverse causality
It can take up to ten years from the first symptoms for an individual to be diagnosed with dementia. As a result, studies on dementia can be affected by reverse causation. This occurs when people who are suffering from the early stages of a disease, but have not yet been diagnosed, change their behaviour because of their preclinical illness. In retrospect, this can make it appear that the disease resulted from the change in behaviour.
According to Dr Sarah Floud from Oxford Population Health’s Cancer Epidemiology Unit (CEU), to combat the effects of reverse causality you need two things: large-scale studies that include a high number of dementia cases, and long follow-up periods. A good illustration of this comes from Dr Floud’s work to investigate the associations between dementia risk and participation in cognitive/social activities, and physical activity. The overall evidence from previous studies had been inconclusive: many of these investigations either had short follow-up periods of ten years or less (making it difficult to account for reverse causation), or they had long follow-up periods but relatively few cases of diagnosed dementia, reducing their statistical power.
Dr Floud used data from the UK Million Women Study: an ongoing study which recruited one in four of all UK women born between 1935 and 1950, and includes over 30,000 people with dementia. The results showed that although the risk of developing dementia was 60% higher for inactive women during the first five years of follow-up, this risk reduced substantially over time, becoming almost zero after 15 years. Similarly, non-participation in cognitive or social activities was only associated with an increased risk of dementia during the first decade of follow-up, with little or no association during the second decade. ‘The insidious onset of dementia makes it difficult to disentangle which factors are likely to represent causes of dementia, and which are consequences of very early dementia symptoms. The UK Million Women Study has the long-term follow-up and large size needed to investigate this, and we are continuing to examine whether other risk factors for dementia are likely to be true causal risk factors,’ Dr Floud said.
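The logic of reverse causation can be made concrete with a toy simulation. The sketch below is purely illustrative (invented numbers, not Million Women Study data): people with undiagnosed preclinical disease are both more likely to be inactive at baseline and certain to be diagnosed within a decade, so inactivity looks ‘risky’ in the first five years of follow-up even though it causes nothing in this model, and the association fades once follow-up extends beyond the preclinical window.

```python
import random

random.seed(42)

# Toy model (illustrative only): preclinical illness makes people both
# inactive at baseline AND diagnosed within ten years, so inactivity is
# spuriously associated with dementia during early follow-up.
N = 200_000
people = []
for _ in range(N):
    preclinical = random.random() < 0.02              # undiagnosed disease at baseline
    # Preclinical illness raises the chance of being inactive (0.60 vs 0.30).
    inactive = random.random() < (0.60 if preclinical else 0.30)
    if preclinical:
        dx_year = random.uniform(0, 10)               # diagnosed within a decade
    elif random.random() < 0.05:                      # background incidence
        dx_year = random.uniform(0, 20)
    else:
        dx_year = None                                # never diagnosed during follow-up
    people.append((inactive, dx_year))

def relative_risk(start, end):
    """Risk ratio (inactive vs active) for diagnoses in [start, end) years."""
    def risk(group):
        cases = sum(1 for inac, dx in people
                    if inac == group and dx is not None and start <= dx < end)
        total = sum(1 for inac, _ in people if inac == group)
        return cases / total
    return risk(True) / risk(False)

early = relative_risk(0, 5)    # first five years: elevated, from reverse causation
late = relative_risk(15, 20)   # after 15 years: near 1, preclinical cases have surfaced
print(f"RR years 0-5:   {early:.2f}")
print(f"RR years 15-20: {late:.2f}")
```

The pattern mirrors the qualitative finding described above: a strong early association that shrinks towards the null with longer follow-up, which is why large studies with long follow-up are needed to separate cause from consequence.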
Challenge three: the genetic factor
An individual’s risk of developing dementia is also influenced by genetics; Alzheimer’s disease, for example, has been associated with over 60 gene variants so far. Although the exact role of many of these is currently unclear, these variants are already revealing new insights into the origins of disease and potential targets for treatments. ‘We have learned so much from both rare gene variants with large effects, and more common ones with smaller effects,’ says Cornelia van Duijn, Professor of Epidemiology at Oxford Population Health. ‘For instance, genetic studies brought to the surface how many other cell types besides neurones are involved, including astrocytes and microglia, as well as major pathways such as lipid metabolism and the innate immune system.’
The challenge now facing researchers is to unpick how these genetic factors interact with environmental and lifestyle factors. According to Professor van Duijn, it’s a numbers game: ‘Studies need to be four to ten-fold larger to study the interaction between two factors compared with studying the factors by themselves,’ she says. However, as genetic sequencing technologies continually improve, more large-scale cohort studies are adding whole-genome sequencing (WGS) information to their databases. This includes UK Biobank’s ambition to add WGS data for all 500,000 participants. Naomi Allen, Professor of Epidemiology at Oxford Population Health and Chief Scientist for UK Biobank, says: ‘These data, combined with the lifestyle, environmental and health outcome data already available in the resource, will significantly enhance our understanding of the role of genetic factors in the development of dementia (and a wide range of other health outcomes) and will advance drug discovery and development.’
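The ‘four-fold’ end of that range has a simple statistical intuition, which the back-of-envelope simulation below illustrates (a toy balanced 2×2 design with made-up numbers, not a genetic analysis): an interaction contrast combines four group means rather than two, so its estimate has roughly four times the variance of a main-effect estimate, and matching its precision therefore needs roughly four times the sample size.

```python
import random
import statistics

random.seed(1)

# Toy illustration: in a balanced 2x2 design, the interaction contrast
# (m11 - m10 - m01 + m00) sums four cell means, so its variance is about
# four times that of a main-effect contrast on the same data.
def simulate_once(n, sigma=1.0):
    """Return (main-effect estimate, interaction estimate) for one dataset."""
    # Four equal cells of n/4 people; true effects are all zero here, so the
    # estimates measure pure sampling noise.
    cells = [[random.gauss(0, sigma) for _ in range(n // 4)] for _ in range(4)]
    m00, m01, m10, m11 = (statistics.fmean(c) for c in cells)
    main = (m10 + m11) / 2 - (m00 + m01) / 2   # effect of factor 1 alone
    interaction = m11 - m10 - m01 + m00        # factor 1 x factor 2 interaction
    return main, interaction

estimates = [simulate_once(400) for _ in range(2000)]
var_main = statistics.pvariance([e[0] for e in estimates])
var_inter = statistics.pvariance([e[1] for e in estimates])
print(f"variance ratio (interaction / main effect): {var_inter / var_main:.1f}")
```

The ratio comes out close to four; in real cohort data, unbalanced exposures and rarer variant carriers push the required sample size higher still, towards the ten-fold end of Professor van Duijn’s range.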
Challenge four: untangling correlation from causation
Even when studies consistently find that a factor is strongly associated with an increased risk of dementia, this does not necessarily mean that the two are linked by a direct, causal mechanism. For instance, greater exposure to air pollution has been associated with a higher risk of dementia, but this could be due to many different reasons. Potentially, air pollution may directly damage the brain by increasing oxidative stress and neuroinflammation, or by causing tiny magnetic particles to lodge in the brain. Alternatively, air pollution may increase the risk of dementia by aggravating other risk factors such as heart disease. There could also be a confounding factor, such as socio-economic deprivation, since people who live in less polluted areas may be able to afford a higher overall quality of life.
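How a confounder manufactures an association can again be shown with a toy simulation (invented numbers, not real pollution or dementia data): here deprivation raises both pollution exposure and dementia risk, while pollution itself has no effect at all, yet a crude comparison makes pollution look harmful. Stratifying by the confounder, as epidemiologists do, dissolves the association.

```python
import random

random.seed(0)

# Toy model of confounding (illustrative only): deprivation drives both
# pollution exposure and dementia risk; pollution has no causal effect here.
N = 100_000
rows = []
for _ in range(N):
    deprived = random.random() < 0.5
    polluted = random.random() < (0.7 if deprived else 0.3)    # deprivation -> exposure
    dementia = random.random() < (0.10 if deprived else 0.05)  # deprivation -> disease
    rows.append((deprived, polluted, dementia))

def risk(subset, polluted):
    """Dementia risk among those with the given pollution exposure."""
    group = [r for r in subset if r[1] == polluted]
    return sum(r[2] for r in group) / len(group)

# Crude comparison: pollution looks harmful...
rr_crude = risk(rows, True) / risk(rows, False)

# ...but within each deprivation stratum the association disappears.
deprived_rows = [r for r in rows if r[0]]
affluent_rows = [r for r in rows if not r[0]]
rr_deprived = risk(deprived_rows, True) / risk(deprived_rows, False)
rr_affluent = risk(affluent_rows, True) / risk(affluent_rows, False)
print(f"crude RR: {rr_crude:.2f}; "
      f"stratified RRs: {rr_deprived:.2f} (deprived), {rr_affluent:.2f} (affluent)")
```

Real studies can only adjust for confounders they have measured, which is why even a carefully adjusted observational association falls short of the randomised evidence discussed next.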
Ultimately, the best way to separate correlation from causation is to use a gold-standard, randomised controlled trial, as demonstrated by the example of Omega-3 polyunsaturated fatty acids (PUFAs). Various observational studies have reported an association between high fish consumption and a reduced risk of cognitive decline – up to 70% lower in some cases. It was proposed that this was due to oily fish being good sources of Omega-3 PUFAs, which are essential for brain function and development. This helped fuel a thriving market in Omega-3 PUFA supplements, marketed as promoting ‘lifelong healthy brain function’. Yet evidence from clinical trials suggests that Omega-3 PUFAs may not be an effective treatment for people who already have dementia, and it has not yet been proven through controlled studies that these supplements can prevent or slow the onset of dementia in healthy people. Potentially, the observed association between Omega-3 PUFAs and dementia risk may be linked to another dietary factor, or to lifestyles that are healthier in general.
As an add-on to the ASCEND clinical trial, Professor Jane Armitage from Oxford Population Health’s Medical Research Council Population Health Research Unit is providing new evidence on this by looking at dementia and cognition (memory and thinking skills) in ASCEND participants, who were randomised to aspirin or placebo, and separately to fish oils or placebo, for over seven years. Results for aspirin have shown that taking daily low-dose aspirin does not affect the risk of dementia or mental decline among adults with type 2 diabetes.
Challenge five: matching real-world contexts
Ultimately, health research can only be as good as the data that powers it. This makes it important that studies use measures that are accurate, and reflect real-world situations as closely as possible. For instance, impaired hearing has been associated with an increased risk of dementia, but standard methods to assess hearing ability can miss certain types of hearing impairment, since they tend to be based on picking out pure tones in quiet environments. Recently, Dr Littlejohns led a study investigating the association between hearing impairment and dementia, based on data from the UK Biobank. In the study, over 82,000 participants were tested specifically on their ability to pick out speech in noisy environments (‘speech-in-noise’ hearing) using the Digit Triplet Test. ‘We found those with insufficient and poor speech-in-noise hearing had a 61% and 91% greater risk of developing dementia compared to those with normal speech-in-noise hearing’ said Dr Littlejohns. ‘Consequently, an even greater number of people than previously thought who are affected by hearing loss may be at increased risk of dementia. Potentially, this offers the possibility of reducing a significant number of new dementia diagnoses if impaired speech-in-noise hearing for these people can be improved, for instance by using hearing aids.’
Although dementia will likely always be challenging to study, Oxford Population Health researchers are addressing this through combining large-scale studies with innovative big data approaches. In this way, we are playing a key role in generating the evidence which will lead to more effective treatment and prevention of these diseases.