Ontosight® – Biweekly Newsletter, June 17th – June 30th, 2024
Drug prices are too expensive: Here’s how technology can fix that
This article was originally contributed to “Forbes.com”
The cost of drug development has skyrocketed, but disruptive technologies can bring it back down to earth. Over the past several decades, drug-development costs have risen significantly, from $250 million per approved drug prior to the 1990s to $403 million in the 2000s and $873 million in 2010 ($1.778 billion if capitalization over the 14-year approval period is accounted for). Costs are spread across the development cycle: roughly one-third is incurred in discovery and preclinical development, close to two-thirds in clinical development and about 5% in the submission-to-launch phase.
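The "capitalized" figure simply compounds each year's out-of-pocket spending forward to the launch date at a cost of capital. Here is a minimal sketch of that arithmetic, assuming a uniform spending profile and an 11% annual rate; both are illustrative assumptions rather than figures stated in the article, so the result only lands in the same ballpark as the published $1.778 billion estimate.

```python
# Sketch of "capitalizing" out-of-pocket R&D spend to the launch date:
# each year's cost is compounded forward at the cost of capital.
# The uniform spending profile and the 11% rate are illustrative
# assumptions; published estimates use phase-specific costs and timelines.
out_of_pocket_total = 873.0   # $M per approved drug, from the article
years = 14                    # approval timeline, from the article
cost_of_capital = 0.11        # assumed annual rate

annual_spend = out_of_pocket_total / years
capitalized = sum(annual_spend * (1 + cost_of_capital) ** (years - t)
                  for t in range(1, years + 1))
print(f"Capitalized cost: ${capitalized:,.0f}M")  # roughly $1.9B under these assumptions
```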
However, rising drug-development costs come against the backdrop of a critical need for novel pharmaceutical products to treat conditions that impose a heavy global burden of death and disability. Critical modern needs include vaccines for neglected tropical diseases, chemotherapy targeted to an individual's genome and novel antibiotics to combat antibiotic resistance.
Below we'll discuss the top challenges in drug development and how disruptive technologies such as artificial intelligence (AI), blockchain and big data are poised to reduce costs and potentially solve these critical social problems.
Challenge No. 1: Increased Timelines
By 2011 estimates, the total drug-approval timeline averages 14 years. The majority of that time is concentrated in R&D: about four and a half years in discovery, one year in preclinical testing, roughly one and a half, two and a half and two and a half years in the three clinical development phases, and 18 months from submission to launch.
One of the biggest bottlenecks in drug development is identifying promising disease targets, and AI has shown promise in automatically structuring big data. With this innovation, pharmaceutical companies can monitor public health data and correlate it with electronic medical records (EMRs) using continuous analytics to generate novel insights on potential disease targets. For instance, the well-publicized literature-derived link between fish oil supplements and Raynaud's syndrome led to successful preclinical treatments for the disease.
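To make the idea concrete, the sketch below shows Swanson-style A-B-C literature linking, the kind of co-occurrence analysis behind the fish oil/Raynaud's discovery. The toy corpus and term sets are invented for illustration and do not represent any real extraction pipeline.

```python
from collections import defaultdict
from itertools import combinations

# Toy corpus: each "document" is the set of biomedical terms it mentions.
# In practice the terms would come from entity extraction over abstracts,
# EMR notes or public health reports (invented data, not a real pipeline).
documents = [
    {"fish oil", "blood viscosity"},
    {"fish oil", "platelet aggregation"},
    {"blood viscosity", "raynaud's syndrome"},
    {"platelet aggregation", "raynaud's syndrome"},
    {"aspirin", "platelet aggregation"},
]

# Count how often each pair of terms co-occurs in the same document.
cooccur = defaultdict(int)
for terms in documents:
    for a, b in combinations(sorted(terms), 2):
        cooccur[(a, b)] += 1

def linked_via(term_a, term_c):
    """Swanson-style A-B-C linking: find intermediate terms B that co-occur
    with both A and C, even if A and C never co-occur directly."""
    def neighbours(t):
        return {b if a == t else a
                for (a, b), n in cooccur.items() if t in (a, b) and n > 0}
    return neighbours(term_a) & neighbours(term_c)

# Suggests a hidden link between fish oil and Raynaud's syndrome
# through shared intermediates such as blood viscosity.
print(linked_via("fish oil", "raynaud's syndrome"))
```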
Challenge No. 2: Unreliability Of Published Data
A 2011 correspondence published in Nature reported that only 20-25% of published preclinical targets were replicable in in-house pharmaceutical experiments for target validation. The authors politely noted "an unspoken rule among early-stage venture capital firms that 'at least 50% of published studies, even those in top-tier academic journals, can't be repeated with the same conclusions by an industrial lab'" and hinted at a "pressure to publish."
Dr. John Ioannidis was more plain-spoken in his meta-research paper titled “Why Most Published Research Findings Are False,” citing high-profile examples of needed reforms in scientific research and concluding, “Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true.”
AI can help solve this problem by widely cross-referencing published scientific literature with alternative information sources less susceptible to bias, including theses, conference abstracts, unpublished data sets, public databanks and clinical trials.
Blockchain holds the potential to further reduce bias in clinical trials with “smart contracts” to create a tamper-proof digital ledger for data collection.
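As an illustration of the tamper-evidence property, here is a minimal sketch of an append-only, hash-chained record store for trial data. It is not an implementation of any particular blockchain or smart-contract platform, and the record fields are invented for the example.

```python
import hashlib, json, time

# Minimal append-only ledger: each record commits to the hash of the
# previous one, so retroactive edits break the chain (illustrative sketch,
# not a production blockchain or a specific smart-contract platform).
class TrialLedger:
    def __init__(self):
        self.blocks = []

    def append(self, record: dict) -> str:
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        payload = {"record": record, "prev": prev_hash, "ts": time.time()}
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        self.blocks.append({**payload, "hash": digest})
        return digest

    def verify(self) -> bool:
        """Recompute every hash and check the chain is unbroken."""
        prev = "0" * 64
        for block in self.blocks:
            payload = {"record": block["record"], "prev": block["prev"],
                       "ts": block["ts"]}
            digest = hashlib.sha256(
                json.dumps(payload, sort_keys=True).encode()).hexdigest()
            if block["prev"] != prev or block["hash"] != digest:
                return False
            prev = digest
        return True

ledger = TrialLedger()
ledger.append({"subject": "S-001", "visit": 1, "endpoint": 4.2})
ledger.append({"subject": "S-001", "visit": 2, "endpoint": 3.9})
print(ledger.verify())  # True; tampering with any stored record makes this False
```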
Challenge No. 3: The Increasing Complexity Of Clinical Trials For Chronic Diseases
The CDC refers to chronic diseases as "common, costly and preventable." At current estimates, chronic diseases account for 86% of the United States' $2.7 trillion in annual health care expenditure, which makes them a high priority for drug development.
However, chronic diseases follow complex, nonlinear predictive models, take years to develop, occur outside of structured health systems, involve habitual behavior and interact with social determinants of health such as urban planning and income. The CDC estimates that the majority of chronic diseases are due to four modifiable risk behaviors: physical inactivity, poor nutrition, tobacco use and excessive alcohol consumption. All of these factors have to be controlled for in clinical trials to demonstrate drug efficacy, and this drives up costs.
One of the best hopes for chronic disease management is AI, because it can structure big data from everyday life outside the clinic, which is where most chronic disease plays out. As an example, AI has been recommended as a way to incorporate data from wearables and smartphones into clinical trials.
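As a simplified illustration of folding wearable data into a trial dataset, the sketch below rolls raw step-count readings up into per-participant summary features. The data layout and field names are assumptions made for the example.

```python
from datetime import date
from statistics import mean

# Toy wearable feed: (participant_id, date, step_count) tuples, as they might
# arrive from a smartwatch export (illustrative data and field names only).
readings = [
    ("S-001", date(2024, 6, 17), 4200),
    ("S-001", date(2024, 6, 18), 6100),
    ("S-002", date(2024, 6, 17), 9800),
    ("S-002", date(2024, 6, 18), 10250),
]

# Roll raw readings up into per-participant summaries that could sit
# alongside conventional clinic-visit data in a trial dataset.
per_participant = {}
for pid, day, steps in readings:
    per_participant.setdefault(pid, []).append(steps)

features = {pid: {"days_observed": len(vals), "mean_daily_steps": mean(vals)}
            for pid, vals in per_participant.items()}
print(features)
```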
Challenge No. 4: Lower Success Rates From Agile Development Groups
By outsourcing the first phase of drug discovery to small organizations, large pharmaceutical companies can become more agile and innovative. But small development groups also have lower success rates in getting new molecular entities approved than large organizations do, because they lack the larger players' historical knowledge.
For a big pharma company, finding the right startup collaborators can be as important as finding the right compound. In the same way that AI can track widely varying sources from discrepant contexts to build a cohesive picture for drug discovery, it can also identify human talent by tracking public data. By surfacing trends from important deals and mergers, fast-tracked clinical trials and key opinion leaders, AI can pick out promising biotech startups from their myriad competitors.
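One toy way to picture this is a weighted score over public signals. The signals, weights and numbers below are purely illustrative assumptions, not a description of any real ranking model.

```python
# Illustrative scoring of startups from public signals (deal flow, trial
# milestones, KOL mentions). Signal names, weights and data are invented.
weights = {"deals": 0.4, "fast_tracked_trials": 0.4, "kol_mentions": 0.2}

startups = {
    "Startup A": {"deals": 3, "fast_tracked_trials": 1, "kol_mentions": 12},
    "Startup B": {"deals": 1, "fast_tracked_trials": 2, "kol_mentions": 30},
    "Startup C": {"deals": 0, "fast_tracked_trials": 0, "kol_mentions": 5},
}

def score(signals):
    # Normalize each signal by the maximum observed value, then weight.
    maxima = {k: max(s[k] for s in startups.values()) or 1 for k in weights}
    return sum(weights[k] * signals[k] / maxima[k] for k in weights)

for name, signals in sorted(startups.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(signals):.2f}")
```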
In a cutthroat industry, pharmaceutical companies need to keep up or be left behind. Perhaps one of the strongest arguments for adopting AI is that everyone else is doing it, too. NBC News reports that as of February 2018, 16 pharmaceutical companies and more than 60 startups were using AI for drug discovery.
The question for pharmaceutical companies may not be whether to use AI to cut costs but which AI tools to use.
Pharmaceutical companies that decide to act now will avoid being caught off guard when the technology becomes more mainstream. They'll also be recognized for improving the industry for patients and healthcare providers alike.
Featured Blogs
Machine learning as an indispensable tool for Biopharma
The cost of developing a new drug roughly doubles every nine years (inflation-adjusted), a trend known as Eroom's Law. As the volume of data…
Find biological associations between ‘never thought before to be linked’
There was a time when science depended on manual efforts by scientists and researchers. Then, came an avalanche of data…
Find key opinion leaders and influencers to drive your therapy’s
Collaboration with key opinion leaders and influencers becomes crucial at various stages of the drug development chain. When a pharmaceutical…
Impact of AI and Digitalization on R&D in Biopharmaceutical Industry
Data are not the new gold – but the ability to put them together in a relevant and analyzable way…
Why AI Is a Practical Solution for Pharma
Artificial intelligence, or AI, is gaining more attention in the pharma space these days. At one time evoking images from…
How can AI help in Transforming the Drug Development Cycle?
Artificial intelligence (AI) is transforming the pharmaceutical industry with extraordinary innovations that are automating processes at every stage of drug…
How Will AI Disrupt the Pharma Industry?
There is a lot of buzz these days about how artificial intelligence (AI) is going to disrupt the pharmaceutical industry….
Revolutionizing Drug Discovery with AI-Powered Solutions
Drug discovery plays a key role in the pharma and biotech industries. Discovering unmet needs, pinpointing the target, identifying the…
Leveraging the Role of AI for More Successful Clinical Trials
The pharmaceutical industry spends billions on R&D each year. Clinical trials require tremendous amounts of effort, from identifying sites and…
Understanding the Language of Life Sciences
Training algorithms to identify and extract Life Sciences-specific data. The English dictionary is full of words and definitions that can be…
Understanding the Computer Vision Technology
The early 1970s introduced the world to the idea of computer vision, a promising technology automating tasks that would otherwise…
AI Is All Hype If We Don’t Have Access to
Summary: AI could potentially speed drug discovery and save time in rejecting treatments that are unlikely to yield worthwhile results. AI has…