Last scene of all, that ends this strange eventful history,
Is second childishness, and mere oblivion . . .
–Shakespeare, As You Like It
Before the last century, only a small portion of the human population survived into the eighth decade of life. Those few individuals who successfully avoided the myriad causes of adult mortality – principally, infectious diseases, trauma, and cardiovascular failure – were expected to face a steady attrition of their most human qualities: memory, reasoning, judgment, abstraction, and language. In the popular mind, and even among scientists and philosophers, the idea that great age inevitably brought about an inability to think clearly was widely accepted. But intensive research into the pathology and biochemistry of the aging brain during the last few decades has revealed that specific diseases cause major impairment of cognition late in life and that the process of aging per se results only in relatively subtle changes in certain mental functions.
This reinterpretation of the nature of the aging mind has profound implications at both the personal and societal levels. In contrast to the assumption inherent in Jaques’s soliloquy, the passing of time does not by itself destroy our ability to think cogently. Rather, certain diseases that devastate those areas of the brain serving memory and cognition become increasingly prevalent after age 70 or so. For example, the two major causes of late-life dementia in most developed nations, Alzheimer’s disease and multiple small strokes (multi-infarct dementia), afflict just a few individuals in their forties or fifties, but the numbers rise very substantially in the mid-sixties and beyond.
In this sense, aging, the passage of time, does contribute to the development of dementing diseases in at least two broad ways. First, over time, the brain accrues molecular and cellular defects in neurons and glia, which reduce its physiological reserve, just as occurs in muscle cells with age. This process makes the brain more susceptible to loss of function if and when a neurological disease is imposed. Second, some of the specific diseases that cause dementia require many years to produce enough brain abnormalities, or lesions, to compromise function. For instance, in Alzheimer’s disease and certain other dementias, considerable time is needed to reach a critical tissue concentration of particular proteins that allows for their polymerization into potentially toxic forms. In short, the process of brain aging can contribute to the development of a clinically noticeable dementing illness, but aging by itself appears to be insufficient to cause the illness.
Life expectancy at birth in the United States and in many other developed nations has risen from roughly fifty years in 1900 to more than seventy-five years in 2000, an unprecedented 50 percent increase in just one century. This sudden jump in average longevity is the result of major improvements in public health, intensive biomedical research, and subsequent pharmacological, surgical, and lifestyle interventions. It is by no means assured that life expectancy will continue to rise in the coming century, given the threat of highly resistant infectious diseases and an emerging epidemic of obesity and associated metabolic disease.1 Nevertheless, the sheer number of humans now surviving beyond eighty years and the accompanying social and economic stresses demand that the scientific community focus far more attention on the determinants of successful aging and the prevention of age-linked disease – particularly in the brain, which helps regulate non-neural organ function.
Based on personal observation, many people have come to realize that the aging process does not usually wreak havoc on the mind. But as recently as thirty years ago, gerontologists and neuroscientists were not at all sure of this conclusion and continued to catalog a complex array of relatively minor deficits in the numbers and biochemical properties of brain cells in aged mammals, including humans. Understandably, scientists focused mostly upon the health of neurons, the excitable cells in the brain that convey signals through electrochemical impulses – for example, a response to light impinging upon the photoreceptor cells of the retina or to sound waves vibrating the hair cells of the inner ear. Because the long cytoplasmic extensions of neurons, the axons and dendrites, pass information from one place to another in the brain, age-related defects in the innumerable molecules that allow them to do so could lead to cognitive failure. Indeed, scientists have documented a host of quantitative and qualitative changes in neuronal receptors, enzymes (specialized proteins that catalyze chemical reactions), structural proteins, and lipids in the brains of aged rodents, lower primates, and humans.
But when one counts the actual numbers of surviving neurons in aged versus middle-aged or young brains, most brain regions show very little or no significant neuronal attrition. This recent realization flies in the face of the long-held assumption that neurons steadily die out during the life span, a conclusion based on what we now recognize as technically flawed cell-counting methods. For example, the number of pyramidal neurons in certain areas of the hippocampus, a seahorse-shaped brain region critical for memory, does not decline appreciably in older humans.
On the other hand, the number of neurons in the substantia nigra – a small cluster of neurons in the brain stem that secrete the neurotransmitter dopamine – does decline steadily with age, perhaps because these cells produce the pigment neuromelanin as a by-product of their dopamine metabolism, a process that results in the excessive oxidation of proteins and lipids. The age-related dysfunction and loss of substantia nigra neurons likely contribute to the decreased speed and fluidity of movement and the somewhat stooped, shuffling gait that very old people often display. This finding illustrates the relationship between the aging process in the brain and the diseases of the elderly. Age-associated nigral cell loss, which may normally amount to 30 to 50 percent or so of these neurons, is not sufficient to induce the clinical syndrome of Parkinson’s disease. However, this level of attrition may reduce the physiological reserve enough so that a superimposed insult, e.g., the presence of an inherited mutation in a specific gene or prolonged exposure to an environmental toxin, may elevate the degree of nigral cell loss to some 70 to 80 percent, enough to produce clinically apparent symptoms of Parkinson’s disease. But it must be added that the loss of neurons during normal aging in the substantia nigra is more severe and predictable than one observes in many other regions of the brain, such as the cerebral cortex.
Even when the absolute number of neuronal cell bodies does not decline substantially, the brains of older mammals reveal a remarkable array of cellular and molecular alterations. There are defects in nuclear and mitochondrial DNA; in many different proteins, particularly enzymes; and in the lipids of the membranes enveloping cells and internal organelles. What bearing do these diverse molecular changes have on the mind?
For most of us, the answer is very little. In aged people without Alzheimer’s disease and other mind-threatening illnesses, the clinical effects of biochemical and anatomical alterations seem to be modest. In many studies reporting age-related neurochemical deficits – such as a reduction in a particular enzyme or in certain proteins or RNA molecules – the levels or functional activities in elderly adults have ranged from 5 to 30 percent below those in young adults. And though a 30 percent loss might seem quite high, such gradual declines over several decades often have little measurable effect on thinking. Indeed, positron emission tomographic (PET) scans and functional magnetic resonance imaging (fMRI) scans show that the brains of healthy people in their eighties are almost as active metabolically as those of people in their forties. In some brain regions such as parts of the frontal cortex, healthy aged humans may even exhibit more metabolic activity, though it is unclear whether this seemingly paradoxical rise in activity represents the brain’s attempt at compensation for some neuronal loss or just a nonspecific and potentially adverse recruitment of remaining local neurons.2 Overall, the aged brain tolerates relatively small deficits in neuronal structure and function rather well, although certain mental functions required for highly specialized activities – such as the rapid visual-motor tasks required to pilot a 747 or perform complex surgery – may become compromised in older humans.
Epidemiological and neuropsychological studies generally paint a similar picture to that emerging from neurobiological research. Estimates of the prevalence of senile dementia – the progressive loss of cognitive function after roughly age 65 – vary widely, but most data suggest that a large majority of individuals in their seventies and eighties are free of significant cognitive loss that interferes with daily function. And analyses of healthy elderly adults reveal only subtle declines in performance on tests of memory, perception, and language. One decrement on which numerous studies agree, however, is a reduction in the speed of some aspects of cognitive processing. Hence, septuagenarians are often unable to quickly retrieve certain details of a particular past event – say, the precise date or place – although they are often able to recall the information minutes or hours later. Given enough time and an environment that keeps anxiety at bay, many healthy elders score almost as well as young or middle-aged adults on tests of mental performance. A measure of guarded optimism emerges from investigations of ‘normal brain aging’: one may not learn or remember as rapidly later in life, but one may learn and remember nearly as well.
The range of brain diseases that express themselves as a progressive loss of intellectual function is remarkably broad. Vascular, metabolic, infectious, neoplastic, traumatic, and degenerative disorders can all present with symptoms of dementia.
At different times over the course of the last century, various disorders have assumed greater or lesser relative importance in contributing to late-life dementia. In the early 1900s, for example, neurosyphilis was considered a common cause of dementia; Alzheimer’s disease had not yet been recognized as a specific brain disorder. More recently, the proportion of dementia cases attributable to one or more strokes has declined because of the successful control of hypertension and hyperlipidemia and the gradual reduction in some types of cardiovascular disease. When Alzheimer’s disease comes under reasonable medical control, other disorders will assume greater relative importance in the differential diagnosis of late-life dementia.
But in developed countries today, Alzheimer’s disease is still by far the most common basis for senile dementia, accounting for some one-half to two-thirds of all cases. For several decades after Alois Alzheimer reported his index case, a 53-year-old woman from Frankfurt, the disorder was classified as a rare ‘presenile’ dementia, that is, a dementia having its onset prior to roughly age 65. But in the mid-1960s, three British scientists – Garry Blessed, Bernard Tomlinson, and Martin Roth – conducted landmark clinical-pathological correlative studies that made clear what some earlier investigators had suspected: common senile dementia is usually associated with the classical findings in the brain that Alzheimer had described. The term ‘senile dementia of the Alzheimer type’ was subsequently coined, but nowadays, ‘Alzheimer’s disease’ designates this syndrome, regardless of the age of onset. For research purposes, one still refers to ‘early-onset AD’ and ‘late-onset AD,’ divided arbitrarily at age 65, but little evidence exists that these are fundamentally distinct biological processes or that we could not ultimately treat them as one entity.
In the United States, multi-infarct dementia has long been considered the second most common basis for late-life dementia, even though Parkinson’s disease-associated dementia and a related disorder, Lewy body dementia (named after the characteristic neuronal lesion that defines Parkinson’s disease), are now equally if not more prevalent. Careful microscopic analyses of autopsied Parkinson’s disease brains often reveal the features of AD or else AD plus Lewy body dementia, confounding precise diagnostic classification. Nevertheless, ‘pure’ Alzheimer’s disease is still the most common neuropathological basis for late-life dementia in the United States and most developed countries. A number of less common causes of dementia, including frontotemporal dementia and Creutzfeldt-Jakob disease, share certain pathological or biochemical features with AD, but they are etiologically distinct.
Virtually everyone beyond late middle age has worried that an occasional memory lapse – a name forgotten or an object misplaced – could represent the earliest sign of AD. But such momentary losses, with recovery of the detail within minutes and a complete awareness of the lapse, are usually not progressive. In contrast, the repeated inability to remember recent, minor episodes of daily life – a call from a friend, a trip to the department store, the paying of a bill, a brief news story – can represent the earliest harbinger of AD. In a condition now referred to as ‘mild cognitive impairment (MCI)-amnestic type,’ the individual shows a subtle, intermittent decline in episodic memory but is otherwise intact cognitively and performs very well in everyday life. Evidence from structural and functional magnetic resonance imaging of MCI-amnestic brains suggests that the neuronal dysfunction is restricted to the hippocampus and a small number of other brain structures connected to it. Studies of the fate of MCI-amnestic subjects over time suggest that roughly 12 to 15 percent of them ‘convert’ to clinically diagnosable, mild AD each year, meaning that these individuals begin to exhibit signs of a more general disturbance of recent memory as well as disorientation to time and place, decreased attention span, confusion in executing complex tasks, and sometimes, difficulty in finding words. This slow progression of cognitive symptoms occurs in an individual who appears fully alert and demonstrates no abnormalities of the motor system, e.g., decreased mobility, stiffness, and slowed gait, until later in the disease.
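To put that conversion figure in perspective, an annual rate of 12 to 15 percent compounds substantially over a few years. The following is a minimal sketch, not drawn from the studies themselves, that assumes a constant annual conversion probability (a deliberate simplification of the cohort data) and computes the fraction of an MCI-amnestic cohort expected to have progressed to mild AD after a given number of years:

```python
# Hedged illustration only: assumes a constant annual conversion probability,
# and ignores competing mortality and the minority of MCI patients who revert
# to normal cognition. Real cohort data vary by study and by year of follow-up.

def cumulative_conversion(annual_rate: float, years: int) -> float:
    """Fraction of an MCI-amnestic cohort expected to have converted to mild AD."""
    return 1.0 - (1.0 - annual_rate) ** years

if __name__ == "__main__":
    for rate in (0.12, 0.15):
        for yrs in (1, 3, 5, 10):
            frac = cumulative_conversion(rate, yrs)
            print(f"annual rate {rate:.0%}, {yrs:>2} yr: ~{frac:.0%} converted")
```

On these simplified assumptions, roughly half of such a cohort would be expected to meet criteria for mild AD within five to six years, which is why amnestic MCI is regarded as a frequent harbinger of AD rather than a benign variant of normal aging.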
What causes this initially subtle but ultimately devastating loss of higher cortical function? The answer has begun to emerge from three decades of intensive neuropathological, biochemical, and genetic research. While there is still earnest debate about the detailed sequence of events, the majority of scientists researching AD now believe that the misfolding, aggregation, and accumulation of a small protein of forty-two amino acids, the amyloid ß-protein (Aß), initiates a complex cascade of molecular and cellular changes that compromise neuronal function in brain regions serving memory and cognition.
According to this scenario, widely referred to as the ‘amyloid (or Aß) cascade hypothesis,’ a chronic imbalance between the production and the clearance of this otherwise normal protein arises in the brain long before the first symptoms of dementia. This accumulation leads to the self-association of Aß into ‘oligomers’ (doublets, triplets, quadruplets, etc.), which in turn can assemble into filamentous polymers (‘amyloid fibrils’) that clump together to form the cores, or spherical centers, of tiny plaques. These amyloid deposits are gradually surrounded by degenerating axons and dendrites (collectively called neurites) and activated brain inflammatory cells (microglia and astrocytes), completing the formation of so-called neuritic plaques.
During this slowly evolving process, some of the neurites within and adjacent to the emerging plaque develop rigid intracellular filaments, or ‘paired helical filaments,’ that are composed of a neuronal protein called tau. Tau filaments also accumulate in large bundles that comprise the neurofibrillary tangles found inside many neuronal cell bodies in the hippocampus and cerebral cortex, as well as in certain subcortical neurons that send their axons to these areas. In short, the accumulation and self-assembly of the Aß protein is believed to initiate a series of first functional (biochemical) and then structural (anatomical) changes in selected neurons, to the ultimate detriment of the thinking process.
Perhaps the most compelling evidence for this Aß hypothesis has come from identifying and characterizing genetic mutations that cause rare inherited forms of AD. It is a truism of modern biomedicine that searching patients’ genomes for faulty genes opens up the study of diseases of previously unknown cause and mechanism. For example, until the cloning of the Huntington gene in 1993, no one had any real clue as to what might be killing off certain brain neurons in patients with Huntington’s disease. In this and many other heritable diseases, the unbiased search of the human genome for the genes responsible for the disease allowed scientists to subsequently formulate biochemical hypotheses about what actually kills cells. But in the case of Alzheimer’s disease, the opposite sequence occurred: progress in the 1980s in understanding the biochemistry of the disease identified the proteins that comprise the plaques and tangles, providing geneticists with key clues to the location of the DNA mutations that might cause Alzheimer’s disease.
In 1991, researchers discovered the first mutation responsible for AD on chromosome 21, specifically in the gene that encodes the amyloid precursor protein (APP), the parent protein of Aß. In addition to the fact that APP molecules give rise to the Aß fragments that form the neuritic plaques, a crucial clue that the APP gene might be the site of an AD-causing defect came from a disorder at the opposite end of the life span: Down syndrome. Humans with Down syndrome, or trisomy 21, the most common form of chromosomal duplication compatible with life, invariably develop the plaques and tangles of AD in their thirties and forties. This is because they harbor three copies of the APP gene in all of their cells, rather than the usual two copies. The extra copy of the APP gene results in a roughly 50 percent increase in the cellular levels of the APP protein throughout life and the consequent start of Aß deposition in the Down syndrome brain as early as age 10.
Another powerful clue pointing to the APP gene had come from studying a family in the Netherlands with a history of multiple brain hemorrhages caused by the severe build-up of the Aß protein in cerebral blood vessels. In 1990, scientists discovered that a mutation in the APP gene that changes a single amino acid within the Aß region of APP was responsible for this rare disorder, demonstrating for the first time that mutations in APP could cause Aß accumulation.
With all of this knowledge in hand, geneticists scrutinized the APP region of chromosome 21 in a few families with a hereditary form of AD that led to the onset of dementia in the fifties. In one such family, they discovered a ‘missense’ mutation in APP that changed one amino acid near the end of the sequence encoding the Aß region to another. The study of other families with early onset of AD revealed additional APP missense mutations, most of which occurred in amino acids either at the beginning or at the end of the forty-two-residue Aß region. Tellingly, geneticists did not find any AD-causing mutations away from the Aß region of this large (770–amino acid long) precursor protein, indicating that the mutant amino acids might lead to increased cutting of APP at the beginning or end, resulting in the heightened production of the Aß fragment.
As these genetic findings were emerging, a major biochemical discovery was made: all cells normally produce the Aß peptide throughout life. Thus, Aß is the product of healthy APP metabolism in all of us, implying that unknown factors – genetic, environmental, or both – can increase its production or decrease its degradation in those individuals who develop AD, all of whom have too much Aß in their brains.
Putting together these two key observations – that healthy cells continually make Aß and that rare mutations within its precursor, APP, can cause AD – led to groundbreaking experiments. Inserting a gene that bore an AD-causing APP mutation into cultured cells resulted in significantly greater Aß production. Scientists could now study many details of the production and metabolic fate of Aß in simple cell models. They could also use such cells to screen large libraries of drug-like molecules and pinpoint compounds that lower Aß production without damaging the cells. And through the wonders of genetic engineering, scientists could also create ‘transgenic’ mice that express a human APP gene bearing an AD-causing mutation. After considerable trial and error, the latter approach generated a number of highly useful mouse lines that mimic several, but not all, features of AD in their brains, including the abnormalities of neurites and glia around the amyloid plaques. As they age, these mice develop deficits in cognition such as difficulty remembering how to negotiate mazes efficiently. Taken together, these and many other experiments have produced a wealth of evidence that AD can arise at least in part from an imbalance in the economy of the Aß protein in brain regions important for memory and cognition. The practical outcome has been to encourage scientists to find ways to lower Aß levels in humans.
Still, there are many unanswered questions about the Aß hypothesis. What causes the imbalance in Aß levels in the brains of the large majority of AD patients who do not have known genetic mutations? For example, can environmental factors influence the brain’s Aß levels? Does the Aß peptide begin to aggregate inside the neuron before the Aß oligomers are exported into the extracellular space and then bind back to the cell? Which type of brain cell – neurons, microglia, or astrocytes – is the first to respond adversely to the excess of Aß in the local microenvironment? Precisely why do neuronal extensions, i.e., axons and dendrites, respond with an aggregation of their tau protein? Are the resultant tau aggregates the prime culprits in compromising neuronal function and ultimately killing the neurons? And perhaps most perplexing, how does the entire process selectively target neurons serving memory and cognitive function?
Answering all of these questions in detail should not be necessary in order to treat or even prevent Alzheimer’s disease. Because human genetic data and the modeling of the effects of the faulty genes in engineered mice have continued to support the Aß hypothesis, scientists in both academia and the biopharmaceutical industry have spent the last decade devising strategies to interrupt the Aß cascade at an early point in its development.3 Without knowing precisely how Aß compromises the functions of selected neurons, they have searched for compounds that can decrease brain Aß levels, initially in mouse models.
Three broad approaches have been conceptualized. First, one could partially inhibit one of the two specialized enzymes, ß-secretase and γ-secretase, that cut APP to release the Aß region. Second, one could allow these reactions, which occur normally in all of us, to proceed unimpeded but instead prevent a single Aß protein, a monomer, from binding with another to form oligomers, the small aggregates that appear to initiate the amyloid build-up and the associated short-circuiting of neurons. Third, one could attempt to ‘clear’ the brain of various forms of Aß, including monomers, oligomers, and larger amyloid deposits.
The first approach – inhibiting the protein-cutting enzymes that generate Aß – is somewhat analogous to the use of statin drugs to decrease cholesterol production. Several groups have identified inhibitors of ß-secretase, the enzyme that cuts APP first. But these inhibitors require modification to make them more potent yet still able to penetrate the blood-brain barrier and achieve effective levels in brain tissue. At this writing, there are no such ß-secretase inhibitors ready for human testing. Scientists have also discovered many small molecules able to inhibit γ-secretase, the enzyme that makes the second and final cut of APP. Unfortunately, most of these molecules also interfere with the cutting by γ-secretase of a protein called ‘Notch’ that is crucial for the normal functioning of most cells. However, the serendipitous discovery that certain anti-inflammatory drugs like ibuprofen can gently ‘tweak’ γ-secretase to lower the production of Aß42, a particularly noxious form of Aß, without decreasing Notch cleavage has helped researchers continue to pursue this approach. And since the anti-inflammatory properties of such drugs are not responsible for this selectivity, scientists have identified derivatives that solely tweak γ-secretase and are now testing them in humans. Early trial results suggest that these specialized ‘γ-secretase modulators’ may indeed slow cognitive decline, at least in some AD patients.
The second approach, preventing the self-assembly of Aß into oligomers and fibrils, makes good theoretical sense but has received less attention. While some compounds have performed well in test-tube experiments, very small assemblies of Aß (dimers and trimers) can already interfere with synaptic function and behavior, raising concern that a partial inhibition of Aß aggregation might stabilize such small species and actually worsen the disorder.
The third approach – clearing Aß from the brain – has progressed the furthest to date, advancing into human trials. Here, the novel idea of immunizing patients with the very peptide that builds up in their brains has led to evidence in mice that one can efficiently clear Aß plaques with Aß antibodies. This has been accomplished in two ways: either actively vaccinating the mice with synthetic Aß so that they gradually generate their own Aß antibodies, or passively administering laboratory-made Aß antibodies to them. When the active vaccination approach was initially tried in AD patients, some 6 percent developed inflammatory cell infiltrates in the brain, or meningoencephalitis, and the trial was halted. The apparent reason for the inflammation: some patients had generated specialized T-lymphocytes directed against the tail end of the Aß peptide. Modified active vaccines comprising only the front end of the peptide have now been designed but not yet tested in humans. In the meantime, a phase 2 trial of passive antibody administration is underway in AD patients, with initial results hoped for by late 2006.
In addition to the above approaches to the Aß part of the AD equation, there are strategies that attempt to target other key steps in the disease cascade. These include oxidative injury to neurons, the build-up of tau as tangles, local inflammatory changes, and a potential imbalance of certain metals such as copper and zinc in the AD brain. The use of cell culture and mouse models has assisted in the development of each of these potential therapies, followed in some cases by the initiation of clinical trials. At this writing, unequivocal evidence of successful slowing of the disease has not emerged, but hope runs high.
The advent of therapeutic agents that slow and perhaps even prevent AD could have profound effects on the aged human population, at both the individual and societal levels. A vaccination strategy for a noninfectious disease in late life is unprecedented. Were a safe vaccine or another Aß-lowering therapeutic such as a γ-secretase modulator approved, healthy people might avoid the onset of Alzheimer-type cognitive loss by undergoing the therapy in late middle age or perhaps even earlier. Such an approach would have to include a formal, semiquantitative assessment of an individual’s likelihood of developing AD. Components of such a risk assessment may encompass a neurological examination that includes cognitive testing, a detailed family history, a blood screen for genetic mutations known to predispose to AD or other dementias, a blood test for plasma Aß levels, and a special brain imaging procedure like the emerging ‘amyloid scans’ that employ an injected chemical agent to visualize one’s cerebral Aß burden. Such a multicomponent assessment could assign individuals a rough probability of developing AD and perhaps other dementias, and those in moderate- or high-risk categories could then be offered one of the preventative agents envisioned above.
While such a combined diagnostic/therapeutic paradigm seems achievable with time, it raises difficult new questions. How can we administer such a relatively complex protocol to very large numbers of aging individuals? How will we pay for it? Will only relatively well-off individuals in developed nations have access to it, at least for the foreseeable future? And how will we handle the ethical challenges posed by widespread testing for the genetic risk of a major, brain-destroying disease?
And there are other social implications to ponder should a successful therapy for Alzheimer’s disease emerge from current research. The prospect of many more people retaining most of their cognitive functions into late life should accelerate the current trend toward longer careers, potentially displacing younger workers. And because improvements in the physical health of octogenarians will likely accompany the prevention of Alzheimer’s disease, and later other dementias, we will need to expand the availability of activities such as driving, entertainment, tourism, and financial services. Healthy elders themselves will presumably provide much of the labor required to deliver these services, but younger members of the work force should also benefit from these new opportunities.
Medical questions also abound. Could widespread access to effective therapy for late-life cognitive failure actually increase longevity? Certainly, the average life expectancy at birth would rise modestly, at least in developed societies, but will resolving dementia have a direct and measurable impact on the maximal age that humans achieve? Will many more people live to 90 or 100 with their mentation largely intact and then succumb fairly rapidly to other causes of mortality? And will other, currently infrequent forms of cerebral deterioration take the place of Alzheimer’s disease as the primary cause of dementia, just as Alzheimer’s emerged strongly after the eradication of neurosyphilis and the more recent decline in strokes?
The looming prospect of solving Alzheimer’s disease should be incorporated into the thinking of politicians, economists, and all those concerned about planning the future of our societies. While we will no doubt experience numerous fits and starts along the way, it appears increasingly likely that a world with less Alzheimer’s disease lies ahead.
ENDNOTES