Until last month, the only official way to diagnose Alzheimer’s Disease was by cutting into the brain after death, looking for telltale “tangles” and plaque. But last month, a national panel of experts announced new criteria to make the diagnosis—while people are still alive, and as long as 10 years before they even show symptoms.
Dr. John Morris, who directs the Alzheimer’s Research Center at Washington University School of Medicine, was on that panel, convened by the National Institute on Aging and the Alzheimer’s Association. I called to ask him about the hurricane-force blowback that’s followed the announcement.
Physicians are saying that the new evidence for diagnosis—biomarkers detected through brain scans, MRIs, and spinal taps—is too expensive to be accessible to everyone, not to mention painful, inconclusive, and psychologically damaging. Who wants to know their brain shows signs of an imminent disease, when there’s no cure and not much treatment available?
“It is quite true that the biomarkers have not yet been thoroughly validated in all populations, and not all biomarkers are going to be accessible to all patients,” Morris concedes. “But that’s overlooking the point that now we do have those biomarkers, and all those issues are going to be worked out.”
He pauses.
“This is a new era,” he says, spacing his words. “We can see the brain.”
I ask—tentatively—whether seeing plaque in the brain would be conclusive, since apparently quite a lot of people have plaque. “Well, they do, but that’s because they have pre-clinical Alzheimer’s Disease!” he exclaims. Not everyone winds up developing symptoms, and researchers are still trying to fathom who will and who won’t.
But overall, Morris agrees with the critics: In most cases, it is too soon for diagnosis. If someone comes to him wanting to get tested because, say, her mother developed Alzheimer’s, he urges her to think carefully first, about everything from future insurance and employment to the panic of knowing that at some unknown point in the future, her thinking will begin to cloud, and dementia will set in, and there’s not a damned thing she can do to prevent it.
Yes, in rare cases, Alzheimer’s can be hereditary. Yes, there are new medications that might prove helpful in the earliest stages. But those meds aren’t likely to get approved for use for another decade, he notes.
On the other hand, if the woman has a stronger reason to think she possesses the rare gene that makes Alzheimer’s hereditary—say she’s already showing symptoms, or her mother developed the disease unusually early, in her thirties—Morris’ answer might change. Early diagnosis can make a difference, because current medications are likely to be far more helpful now than they will be as the disease makes deeper inroads into the brain.
What’s important about the new criteria, Morris says, is, first, that they reframe the disease itself, defining it “as a disorder of the brain regardless of whether or not the symptoms of the disorder are expressed. Our own work here at Wash. U. suggests that from the beginning of the brain changes to dementia, it’s somewhere between 10 and 15 years.” Physicians, he adds, need to learn to recognize early symptoms, not write them off with labels like “mild cognitive impairment.”
His second point: “Most people, when you say ‘Alzheimer’s,’ have an image of someone who is pretty far down the road. I think the majority are probably in the mild stage, and they’re still able to make decisions. I think we ought to ask them what they want.”
The new criteria will likely be adopted this fall, and experts are predicting that the number of Americans with Alzheimer’s will soon look more like 10 or 15 million than the current 5.3 million estimated by the Alzheimer’s Association.
“The Baby Boomers are turning 65,” Morris points out. “It’s going to be a major epidemic.”—Jeannette Cooperman
Morris leads a study of adult children of parents with Alzheimer’s. Participants do not learn their test results, but they do learn about the latest research.