
Changing the way we pay for genetic testing will save lives

August 31, 2021, 4:30 PM UTC

Today we’re in the midst of the most fundamental shift in medicine in a century, on par with the era of antibiotics. The use of genetic information—about people, about tumors, about viruses—is taking the guesswork out of medicine, giving us the power to leapfrog the trial-and-error of the past to a new model of medicine that leverages molecular information to deliver better, more precise health for each of us.

The COVID-19 pandemic and the astonishing speed with which vaccines were developed have shown what’s possible in this new era of medicine. So too have recent improvements in cancer care. More than 90% of the therapies in development for cancer are in some way directed by molecular genetic information, according to the IQVIA Institute. Our technology is rapidly escaping the limitations of symptom-based, trial-and-error medicine, and our loved ones are surviving longer as a result.

Yet the pace of embracing genetics-based health care has been excruciatingly slow. Nowhere are these shortcomings more evident than in cancer care. We know that cancer survival increases with early detection and treatment, and yet clinical care guidelines and reimbursement favor late-stage detection and treatment. A case in point is colorectal cancer, which has been on the rise, particularly at younger ages. Nearly one in six colorectal cancer patients has inherited gene variants that increase their risk not just for colorectal cancer but for other cancers as well. And yet current testing guidelines and practice reserve genetic testing for the sickest patients and for families with an obvious pattern of inherited cancer.

If genetic testing for elevated cancer risk were a standard part of everyday care, we could fashion appropriate prevention and surveillance matched to each individual’s risk for cancer, as well as intervene earlier if cancer develops. So why don’t we routinely test patients in order to better manage care?

Not to put too fine a point on it, but no one gets paid for identifying asymptomatic people who are at high risk for a disease and then implementing prudent (and cost-effective) screening strategies to help them thrive. Our system pays to treat patients after disease has run undetected and unchallenged, often for years. The unintended consequences of a patchwork of regulatory barriers, combined with a broken pricing market, continue to steer our country's intellectual capital toward the costly (and lucrative) business of diagnosing and treating the sickest patients; there are few incentives for making people healthier. Sadly, U.S. payers seem trapped in this approach. Progressive insurers try pilot programs, and employers occasionally incentivize prevention to bring down costs, but systemic change remains elusive.

Interestingly, some of the biggest innovations are coming from outside health care. We're beginning to see life insurance companies provide molecular testing for cancer risk, therapy selection, and monitoring, and they are seeing early returns. People live longer—and remain customers longer—when they are protecting their health.

The health care sector of our economy should not and cannot abandon its moral imperative to adopt preventive medicine. Systems need to make it easier for health information to be incorporated into routine preventive care, and that begins by making sure it gets paid for. Health care is traditionally quite siloed, but we need to be thinking of it as a platform, one that payers and health care systems can collaborate to create.

The collaboration starts with making diagnostic and molecular tests affordable and the resulting health data accessible to patients and clinicians. In the early days of the pandemic, we witnessed what happens when testing and information infrastructure is missing. The need is just as great outside infectious disease; it's just not as visible. Payers have a vital role in rethinking reimbursement for preventive medicine, and industry must meet the need with reliable, scalable testing.

Industry partnerships can then be layered on that testing and informatics infrastructure to increase utilization. If you have a targeted therapy for cancer, you can’t bring it forward unless you have the testing for that biomarker. But we can keep expanding the menu of genomic information available until it becomes routine enough that genetics becomes a virtuous cycle: The more individual genetic information we have, the more targets the innovators in the biotech and pharmaceutical industries will have to drive the development of new therapies, which in turn spurs more genetic testing to find more individuals who could benefit from such targeted treatment. This allows us to build a scientific knowledge base of what works for whom, across all of health.

This is all, without a doubt, easier to describe than to achieve. But if the pandemic crisis has shown us anything, it is how rapidly innovation happens when all the parties involved in health care collaborate to solve a problem. We cannot afford to wait for the next pandemic emergency or emerging infectious disease to act. We need to take the lessons we've learned over the past year and a half and extend them beyond COVID to deliver a lifetime of better health for billions of people around the globe.

Sean George is the cofounder and CEO of Invitae.
