Have you seen the stories about new tests for Alzheimer’s Disease that supposedly are just around the corner? The claims: A simple blood test, an eye test, even a smell test that could show you are at high risk for Alzheimer’s decades before you develop symptoms. They are promising, scary, and, so far, premature.
Sadly, we’ve seen this movie before. Like most Alzheimer’s-related science, there turns out to be much less to these stories than their promoters claim. For years, we’ve heard about drug cures, brain-training products, diets, and exercise programs that can delay dementia. None have panned out. And it is far too early to know whether cheap, accurate, and non-invasive early testing will, either.
If one or more does prove reliable, it will raise important ethical, medical, and financial questions that we have yet to think through even though the consequences of Alzheimer’s testing have been quietly debated for a long time. Here is a piece I wrote seven years ago on the issue.
Long-term care insurance
One consequence: An accurate, widely-available test to detect Alzheimer’s years before a person shows symptoms will destroy the market for voluntary long-term care insurance. Why? Because people who test positive will be far more likely to want to buy insurance than those who don’t. And because dementia is the single most important condition driving claims, premiums would explode. Already, half of long-term care insurance claims are for people with dementia.
If insurance companies have access to those test results, they will refuse to insure those who test positive, or at least refuse to do so at an affordable price. If the government bars insurers from seeing the results, carriers will simply assume that prospective buyers are more likely to have tested positive for Alzheimer’s and price all policies accordingly. Either way, the already-crippled voluntary long-term care insurance market would almost certainly die.
But that’s just one question. How would patients react to knowing they are more likely to suffer the effects of an incurable disease decades from now? How would it affect their medical care? Imagine someone needs a kidney transplant. Would they fall off the waiting list because they are more likely to show symptoms of dementia in a decade?
And what would “more likely” mean? Would a test show that someone is 100 percent certain to develop cognitive impairment years from now? Fifty percent more likely? And, remember, Alzheimer’s is only one of many cognitive diseases. Even if a person tests as low risk for Alzheimer’s, she may still develop vascular or Lewy body dementia.
Drug researchers are eager to develop a test that identifies high-risk people years before they show symptoms. One reason: It would make new drug testing possible.
Some argue that anti-Alzheimer’s drugs have been unsuccessful because they have been administered too late, after patients begin to show symptoms. Thus, researchers would like to try these drugs sooner. But how can they do that if they don’t know who is at risk? And ethically, how could they experiment on someone who may not have the disease they want to treat? An accurate, non-invasive test would help solve those problems.
On balance, an accurate, low-cost test for one form of dementia is a good thing, and it is probably inevitable even if it isn’t on the immediate horizon. That’s why we need to start thinking today about its medical, ethical, and financial consequences.