Bio-Research: Not Much to Show for a Big Investment...Yet
Biomedical research spending has more than doubled since 2000, while new drug approvals have steeply declined – is it time to refocus priorities?
July 15, 2010
America 3.0 is a new column that assesses fiscal issues across America as if this nation were a company and its citizens were stockholders. Each business plan offers a formal set of goals and aspirations, and realistic strategies for attaining those goals.
In America, we love discovering new things. Since Thomas Jefferson became the first inventor-in-chief and a powerful advocate of science and technology, the U.S. has lavished both private and public resources on research and development. Mostly, these investments have paid off handsomely, resulting in applied technologies that range from cotton gins and steam engines to microchips and gene-based drugs.
Not all investments in research and development, however, succeed by creating new products in a reasonable amount of time. This seems to be the case with the unprecedented expenditures on R&D in the life sciences over the past decade, which includes biotechnology, pharmaceuticals and biomedical technologies. Since 2000, America has shelled out close to $1 trillion in public and private spending on life sciences — more than twice the amount spent in the 1990s — with surprisingly little to show for it in the way of tangible products.
As shareholders in America, Inc., we’re obligated to take a close look at why, and what to do about it.
In 2008 alone, the federal government spent $38 billion on life sciences R&D, mostly through the National Institutes of Health — up from about $10 billion just 20 years ago. The private sector spent an additional $75 billion, a nearly five-fold increase over the same period.
Compare this to $80 billion spent on defense R&D last year, and about $2.2 billion for federal energy R&D (all forms, including fossil fuels and alternative fuels), and you begin to understand the magnitude of this historic investment.
Who Are You?
On the plus side, this outpouring has produced a revolution in our basic understanding of the human organism. Since 2000, scientists have completed the sequencing of the human genome, and have been drilling down to better understand brain function, disease, aging — you name it. Armies of researchers have produced so much data that we have moved from measuring it in megabytes (millions of bytes) 10 years ago to petabytes (quadrillions) today. Soon we will be moving into the age of exabytes (quintillions) of data.
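For readers keeping score at home, the jump in scale is easy to understate. A quick sketch of the decimal byte units involved:

```python
# Decimal byte-unit scales (powers of ten), per standard metric prefixes
SCALES = {
    "megabyte": 10**6,   # millions of bytes
    "gigabyte": 10**9,   # billions
    "terabyte": 10**12,  # trillions
    "petabyte": 10**15,  # quadrillions
    "exabyte":  10**18,  # quintillions
}

# Moving from megabytes to petabytes is a billion-fold increase in scale
growth = SCALES["petabyte"] // SCALES["megabyte"]
print(growth)  # 1000000000
```

In other words, the research community's data output has grown roughly a billion-fold in a decade.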
Some gains have been made — in treating chronic diseases such as heart disease and diabetes, and in reducing stroke and some cancers. Mostly, though, we have been unable to leverage the explosion of research into a cornucopia of new and less expensive drugs, diagnostics and protocols that most people expected a decade ago.
Instead, the era of mega-funding for basic research has seen a shrinkage in the number of new drugs approved by the FDA — from an average of 35 a year in the late 1990s to around 22 a year since 2000. Likewise, only a handful of the thousands of genetic markers and other biomarkers that researchers have associated with disease have resulted in a direct health benefit. Google founder Sergey Brin, who discovered he carries a genetic variant associated with Parkinson's disease, has already contributed $50 million to Parkinson's research.
New technologies and drugs developed in the past 10 years have proved so expensive to develop — as much as $2 billion per successful drug — that the price of drugs has gone up dramatically, with some treatments costing tens of thousands of dollars per person, or more. This defies the normal pattern of new technologies such as microchips and cell phones, which usually become cheaper over time.
Which leads us shareholders to ask, “Why did this happen?”
In part, the dearth of new drugs, along with rising costs, stems from pharmaceuticals that were pulled from the market after showing unacceptable side effects — most famously Merck's Vioxx — which led to costly lawsuits and more regulation. Underlying this, however, is an unanticipated outcome of the "new biology" that may be more fundamentally to blame: despite the gains in new knowledge, the human organism remains far more complex than anyone thought.
A decade ago, scientists talked seriously about the possibility that a deeper understanding of how our bodies work would unleash a wave of new drugs based on that knowledge. The idea was to replace the old trial-and-error method of finding and developing drugs, in which it didn't matter how a drug actually worked, as long as it did. Instead, the outcome is one Socrates might have appreciated: "The more I learn, the more I realize how little I know."
The next agenda item for today’s shareholder meeting is what to do about this conundrum.
The first and most obvious step is to assess what we've bought with all of these billions, and to create a coordinated strategy to test and validate the discoveries that have the greatest chance of success. Currently, the NIH spends under a billion dollars on "translational medicine" — formal projects to convert basic science into applied medicine. This amount needs to be increased as part of a comprehensive plan — not with new money, but by redirecting money from basic research to implementation strategies. Basic research should continue to get significant funding, but not at the expense of applying what we have already learned.
Smoothing the Transition from Beaker to Bedside
An example would be to take the thousands of genetic markers that scientists have tentatively linked to a high risk for disease — markers that have cost taxpayers billions of dollars to identify — and systematically test them in the clinic to find out whether they are useful. These include markers that help identify individuals who might experience dangerous side effects from drugs such as cholesterol-lowering statins, or individuals with a genetic variation that prevents certain drugs — including antidepressants such as Prozac — from working.
This approach, called pharmacogenomics, could reduce health care costs by ensuring that patients are prescribed only those drugs that will actually work for them. To date, most of these markers have been neither validated in clinical trials nor approved by the FDA, and there is no comprehensive plan to test them.
The new master plan should encourage a closer relationship between scientists and doctors to smooth the transition from beaker to bedside. Regulators at the FDA and payers such as Medicare and Medicaid also need to focus on integrating translational projects that rapidly move research into the clinic.
In recent years, some of our greatest minds have spent a fortune on disassembling the human body and studying it like they would a very complicated automobile. Now it’s time for America, Inc. to take what’s been learned and use it to lower health care costs and to build a better product — that product being us.