How AI is transforming the future of healthcare
I recently wrote about five technologies that are shaping the future of healthcare. One of these technologies — artificial intelligence — holds particular potential for improving medical care at the clinical level.
Thanks to the digital revolution, medical professionals don’t have to memorize nearly as much information as they did 50 years ago. Digital technology has liberated physicians, nurses and researchers to focus more mental energy on higher-level cognitive tasks and patient care. Artificial intelligence is poised to take this to the next level.
The medical field must learn to better delegate repetitive, lower-level cognitive functions in order to allow medical professionals to focus more of their mental energy on higher-level thinking. To understand this need, let’s start by looking at a quote from J.C.R. Licklider’s 1960 paper Man-Computer Symbiosis:
“About 85% of my ‘thinking’ time was spent getting into a position to think, to make a decision, to learn something I needed to know. Much more time went into finding or obtaining information than into digesting it … Several hours of calculating were required to get the data into comparable form. When they were in comparable form, it took only a few seconds to determine what I needed to know.”
Herbert A. Simon captured a similar idea when he coined the phrase "bounded rationality." The idea is that human decision making is at its best when people are given limited, relevant information and enough time to process it.
Computers allow us to optimize our decision-making faculties by granting us easier access to information that is critically relevant to a decision while sorting out non-relevant facts or data. We now spend less time deciding what information to look at and more time applying our minds' higher-level computational abilities to the information before us.
As AI continues to advance, it has the potential to extend the power of human thinking in three critical areas: advanced computation, statistical analysis and hypothesis generation. These three areas correspond to three distinct waves within AI development.
First-Wave AI
The first wave of AI comprised "knowledge engineering" technology and optimization programs, which found efficient solutions to real-world problems. Examples include scheduling platforms that improve efficiency or internet-based tax-filing products.
First-wave AI technology has been applied in some ways to the world of medicine. The Framingham Risk Score Calculator, for example, uses AI to estimate a patient’s heart disease risk. However, many opportunities exist to expand the application of this technology within the healthcare industry.
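To make the idea concrete, here is a minimal sketch of how a first-wave, rules-based risk calculator works. The point values and thresholds below are invented for illustration and are not the published Framingham coefficients; the point is simply that every rule is hand-written by an expert rather than learned from data.

```python
# Illustrative sketch of a first-wave, rules-based risk calculator.
# The thresholds and point values are hypothetical placeholders,
# NOT the published Framingham coefficients.

from dataclasses import dataclass


@dataclass
class Patient:
    age: int
    total_cholesterol: int   # mg/dL
    hdl_cholesterol: int     # mg/dL
    systolic_bp: int         # mmHg
    is_smoker: bool


def risk_points(p: Patient) -> int:
    """Sum hand-written rules into a single score (knowledge engineering)."""
    points = 0
    if p.age >= 60:
        points += 4
    elif p.age >= 45:
        points += 2
    if p.total_cholesterol >= 240:
        points += 3
    if p.hdl_cholesterol < 40:
        points += 2
    if p.systolic_bp >= 140:
        points += 2
    if p.is_smoker:
        points += 3
    return points


def risk_category(points: int) -> str:
    """Map the score onto coarse categories, again via fixed rules."""
    if points >= 9:
        return "high"
    if points >= 5:
        return "intermediate"
    return "low"


if __name__ == "__main__":
    patient = Patient(age=62, total_cholesterol=250, hdl_cholesterol=35,
                      systolic_bp=150, is_smoker=True)
    pts = risk_points(patient)
    print(f"{pts} points -> {risk_category(pts)} risk")
```

Every line of logic here was written by a person; the program optimizes and automates, but it never learns anything new.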
Second-Wave AI
Second-wave AI was characterized by machine learning programs that used statistical probability analysis to perform advanced pattern recognition. In contrast to first-wave AI, second-wave AI perceives and learns — sometimes more effectively than humans can.
This evolving pattern-recognition technology has been applied to the medical field in the form of “clinical decision support systems” and other programs that are used to analyze and evaluate genetic test results, retinal scans and echocardiograms. But there remains room for improvement: These programs still cannot fully replace human assessment because they have not matched humans’ capacity for deep data interpretation.
Second-wave AI depends on clean, properly coded datasets to learn from. So while its ability to learn and perceive has reached an impressive level, it is limited in its ability to solve novel problems that lack sufficiently clean or comprehensive datasets.
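As a rough illustration of the contrast with the first wave, the sketch below trains a simple statistical classifier on synthetic, stand-in clinical features using scikit-learn. Nothing here is a real clinical decision support system; it simply shows why clean, labelled data matter so much: the model's only knowledge is the pattern it can extract from the examples it is given.

```python
# Minimal sketch of second-wave AI: a statistical model that learns a
# decision boundary from labelled examples instead of hand-written rules.
# The data here are synthetic stand-ins for coded clinical features.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 500 hypothetical patients, 4 numeric features (e.g. lab values);
# the label loosely depends on the first two features plus noise.
X = rng.normal(size=(500, 4))
y = ((0.8 * X[:, 0] + 0.6 * X[:, 1]
      + rng.normal(scale=0.5, size=500)) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)                      # learn the pattern from data
print("held-out accuracy:", model.score(X_test, y_test))
```

If the labels were noisy or miscoded, the learned pattern would degrade accordingly, which is exactly the limitation described above.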
Third-Wave AI
The world is now entering the third wave of AI, in which programs normalize the context of various data in order to generate novel hypotheses. These technologies are capable of examining huge data sets, identifying statistical patterns and creating algorithms to explain the patterns.
The enormous potential of third-wave AI programs lies in their ability to increase the quantity of data that can be meaningfully analyzed. These programs identify connections between previously unassociated data points by normalizing the contexts of the various points. This allows for the simultaneous generation and testing of novel hypotheses in a host of healthcare scenarios.
Take, for example, healthcare companies such as Johnson & Johnson, which have started to invest in advanced AI programs to gain a competitive advantage. Such technologies have driven significant scientific discoveries, including the correlation between fish oil and Raynaud's disease.
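The fish-oil finding is usually traced to Swanson-style literature-based discovery, and a toy version of that idea is easy to sketch: if concept A co-occurs with concept B in the literature, and B co-occurs with C, but A and C never appear together, then the A–C link becomes a candidate hypothesis. The "documents" below are invented placeholders, not real abstracts.

```python
# Toy sketch of literature-based hypothesis generation: propose indirect
# A-C links that are bridged by a shared concept B but never co-occur
# directly. The documents are invented placeholders, not real abstracts.

from collections import defaultdict
from itertools import combinations

documents = [
    {"fish oil", "blood viscosity"},
    {"blood viscosity", "raynaud's disease"},
    {"fish oil", "platelet aggregation"},
    {"platelet aggregation", "raynaud's disease"},
]

# Build a co-occurrence graph: which concepts appear together directly?
neighbours = defaultdict(set)
for concepts in documents:
    for a, b in combinations(concepts, 2):
        neighbours[a].add(b)
        neighbours[b].add(a)

# Propose indirect A-C links bridged by at least one shared concept B.
hypotheses = set()
for a in neighbours:
    for b in neighbours[a]:
        for c in neighbours[b]:
            if c != a and c not in neighbours[a]:
                hypotheses.add(tuple(sorted((a, c))))

for a, c in sorted(hypotheses):
    print(f"candidate link: {a} <-> {c}")
```

Real third-wave systems work over vastly larger and messier corpora and must normalize terminology before any such linking is possible, but the underlying move — generating hypotheses from previously unassociated data points — is the same.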
Third-wave AI is game-changing because it has the potential to automate almost any repetitive manual task. It can both learn from and explain complex statistical patterns and can teach humans what it learns. These technologies still have a way to go before their full utility will be realized, but the potential is significant.
AI In The Future
Even with the advent of third-wave AI, computers are unlikely to replace the diagnostic role of physicians in the near future. AI programs are still too limited, for example, to assess a wound for infection by synthesizing sensory observations such as heat, color, smell, pain level and drainage.
However, AI is now sophisticated enough to automate many of a physician's tedious, repetitive tasks. It can, for example, reduce the time required to analyze a bacterial swab and recommend a suitable antibiotic prescription. This gives the physician more time and mental energy to perform higher-level functions such as patient education and clinical assessment.
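As a purely illustrative example of that kind of delegation, the sketch below filters a hypothetical susceptibility report down to the drugs an isolate tested susceptible to, leaving the prescribing decision with the clinician. The organism, drugs and results are placeholders, not clinical guidance.

```python
# Purely illustrative sketch of automating a repetitive lookup step:
# matching a cultured organism's susceptibility results to candidate drugs.
# Organism, drugs and results are placeholders, not clinical guidance.

susceptibility_report = {
    "organism": "E. coli (hypothetical isolate)",
    "results": {"drug_A": "susceptible",
                "drug_B": "resistant",
                "drug_C": "susceptible"},
}


def candidate_drugs(report: dict) -> list[str]:
    """Return drugs the isolate tested susceptible to, for clinician review."""
    return [drug for drug, result in report["results"].items()
            if result == "susceptible"]


print(candidate_drugs(susceptibility_report))   # ['drug_A', 'drug_C']
```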
The potential healthcare applications for AI technology are numerous and exciting. Healthcare providers are exploring the application of AI programs to insurance verification, skin cancer diagnostics, and the analysis of lab results and medical record data. We're only now beginning to explore the depths of healthcare innovation that may be unlocked by continued advancements in AI technology.
As AI applications become increasingly integrated with medicine, more and more people will gain access to high-quality, efficient healthcare.
About the author:
Gunjan Bhardwaj is the founder and CEO of Innoplexus, a leader in AI and analytics as a service for life science industries. With a background at Boston Consulting Group and Ernst & Young, he bridges the worlds of AI, consulting, and life science to drive innovation.