Watson, I presume?
By Joe Petro, March 2012
A doctor it’s not, but artificial-intelligence technology similar to that used by IBM’s Watson to win ‘Jeopardy!’ may make solving future healthcare data and diagnostic problems elementary.
In 2011, we all witnessed the man vs. machine showdown when IBM’s computer, Watson, successfully outsmarted all-time “Jeopardy!” champions three nights in a row. While the application of Watson to a game show is entertaining, what’s really intriguing is imagining how Watson-like technologies could be introduced in healthcare to extend the value of clinical information.
According to IBM, “Watson’s ability to understand the meaning and context of human language, and rapidly process information to find precise answers to complex questions, holds enormous potential to transform how computers help people accomplish tasks in business and their personal lives. Watson will enable people to find specific answers to complex questions rapidly. The technology could be applied in areas such as healthcare, for accurately diagnosing patients, to improve online self-service help desks, to provide tourists and citizens with specific information regarding cities, prompt customer support via phone and much more.”
Behind the scenes, Nuance Healthcare is putting Watson and other artificial-intelligence innovations into clinical boot camp, preparing them to participate in one of the most important jobs there is: saving lives. But before Watson can transition from quiz show to patient care, there will be an emergence of language-understanding technologies that are specially purposed to power Watson and better leverage medical data.
The IBM Power 750, the same system that powers Watson, has been enhanced with several options, including a faster POWER7 processor to handle the most challenging analytics workloads.
The next innovation
If the processing and mining of clinical information can be automated, information suddenly becomes much more valuable. Copious blocks of text can unveil clinical facts that will auto-populate an electronic health record (EHR) template or prompt a physician for additional detail. By putting this kind of technology into the hands of physicians and healthcare organizations, the adoption of EHRs, the achievement of meaningful use, the construction of an information repository for core measures and the unwieldy management of thousands of medical codes (for conversion to ICD-10) all become more easily achievable. With high-powered clinical language-understanding technologies, healthcare organizations can abstract discrete data from physicians’ narratives, free-text reports and other data sources in real time, while the patient is still in the hospital. This data-driven technology will make information access not only better, but automated and part of the care process.
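As a toy illustration of abstracting discrete data from free-text narratives, consider pulling a medication, dose and blood pressure out of a short note. Real clinical language understanding relies on trained statistical models and curated medical vocabularies; the patterns and field names below are invented purely for demonstration.

```python
import re

# Hypothetical note text; a production system would receive full
# dictated narratives, not a one-line summary like this.
NOTE = ("Patient started on metformin 500 mg twice daily. "
        "Blood pressure 140/90. Follow up in 2 weeks.")

def extract_facts(note):
    """Pull a few discrete, EHR-ready fields out of free text."""
    facts = {}
    # Medication name followed by a milligram dose.
    med = re.search(r"(\w+)\s+(\d+)\s*mg", note)
    if med:
        facts["medication"] = med.group(1)
        facts["dose_mg"] = int(med.group(2))
    # Blood pressure written as systolic/diastolic.
    bp = re.search(r"(\d{2,3})/(\d{2,3})", note)
    if bp:
        facts["bp_systolic"] = int(bp.group(1))
        facts["bp_diastolic"] = int(bp.group(2))
    return facts

print(extract_facts(NOTE))
# → {'medication': 'metformin', 'dose_mg': 500, 'bp_systolic': 140, 'bp_diastolic': 90}
```

The point of the sketch is the shape of the task, not the regexes: narrative text goes in, structured fields suitable for an EHR template come out.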
IBM’s Watson computer system, powered by IBM POWER7, competes against two of the most successful and celebrated “Jeopardy!” contestants, Ken Jennings and Brad Rutter.
Imagine the value of technology that could prompt for additional information as a doctor is speaking; a solution that would interact with physicians at the point of documentation to ensure each medical record contains the level of specificity required to achieve ICD-10 reimbursement. This technology, computer-assisted physician documentation, is in development now. This solution will make documentation more productive for the physician, eliminate redundant queries from coding teams and drive the creation of more detailed patient records.
Clinical decision support
Patients’ medical records are extremely complex, with varying treatments, medications and outcomes. Watson could be applied to handle these complexities by creating data repositories of patient information, which could then be mined for valuable insights and ultimately leveraged as knowledge for treating patients with similar symptoms and care scenarios. Watson can be meaningfully applied anywhere in healthcare where structured and unstructured data intersect and/or where a sophisticated question is being asked. By leveraging data repositories (big knowledge bases), Watson is extremely good at extracting knowledge from massive amounts of structured and unstructured data that otherwise would be nearly impossible to break down and assess. Additionally, through natural language processing (NLP) capabilities, Watson is purpose-built to understand highly complex questions and process them in a way that delivers back highly relevant, evidence-based information and answers.
Watson, powered by IBM POWER7, is a workload-optimized system that can answer questions posed in natural language over a nearly unlimited range of knowledge.
In order for Watson to be utilized in healthcare, however, enormous work must be done on the annotation front, which involves educating Watson on what’s important (see sidebar). For example, Watson needs to be guided on how to identify medications, details about treatment protocols, patient outcomes and more. Once Watson comes to know what is important and critical, it will be well positioned to make evidence-based clinical recommendations with high levels of confidence. Specific areas where Watson will likely be applied in healthcare include oncology, where it can be leveraged as an aid in determining a patient’s treatment plan, as well as multiple areas of clinical decision support and differential diagnosis. In general, Watson will be used to support physicians’ decisions by providing real-time evidence that backs up their approach moving forward.
A number of joint development agreements between Nuance and IBM have been announced over the years. One area we’re working on with them is the general advancement of state-of-the-art voice recognition and voice technologies – the way a consumer would use them – making advancements in things like noise filtering. Together, we’re working to tackle some of the long-standing problems that have impacted the quality and utility of voice-driven solutions across various markets and use cases. With IBM, we’re also advancing the retrieval of information from structured information sources through various forms of advanced NLP projects. And thirdly, we’re collaborating with IBM on Watson, where Nuance is contributing development and research resources to advance our own set of Watson-appropriate use models within healthcare, as well as use models in healthcare that IBM is exploring. Our shared work involves a superset of technologies, including speech, NLP and reasoning solutions that we’re working to advance and ultimately bring to market.
A radiologist reviews a patient’s diagnostic image and simultaneously speaks review notes for the ordering physician, delivering fully detailed analysis on exam results.
INSET: A clinician dictates patient notes directly into an EHR. Words appear in real time, allowing her to make corrections if needed via voice to finalize notes without additional processes.
“What’s that you say?”
One of the hurdles that must be overcome in order to make Watson-driven clinical decision support a reality in healthcare is for Watson to understand exactly what a physician is asking. Given the complexity of patient care, physicians’ questions are complex and oftentimes multi-dimensional – they’re very seldom simple. Through a combination of voice and NLP capabilities, Watson will be able to figure out exactly what was asked and the intent behind it, and will swiftly identify the most relevant information associated with both the question and its answer. From a medical perspective, understanding what a physician is asking – whether about an adverse drug reaction vs. a treatment vs. dosing requirements vs. trying to draw analytics from a data repository – is extremely important to the ultimate value Watson will deliver to healthcare. At Nuance, we’re focused on the “speech-to-understanding-to-reasoning” component, and without that initial recognition and understanding, we’ll never get to meaningful, high-value reasoning.
Nuance leverages its own NLP capabilities to hone and improve its speech-recognition accuracy by applying rules-based techniques, as well as statistical and stochastic techniques. We believe that state-of-the-art voice recognition is achieved through a combination of these things. We’re evolving our recognition solutions beyond simply recognizing the spoken word to being able to understand intent based on certain words that are said, as well as surrounding words that are spoken. With recognition, we’ve already achieved out-of-the-box levels at the 95 percent mark and above. Now, a major focus is to further evolve recognition toward understanding and intelligent reasoning. Every word has an audio signature, and we’ve purpose-built our solutions to map each word (acoustic profile) to a massive library of words that we own and will continue to grow through our millions of users, as well as through data processed by our NLP engines.
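A crude way to picture the step from recognized words to intent: once a transcript comes back from recognition, score it against per-intent vocabularies. Production systems use statistical and stochastic models as described above; the keyword lists and intent labels here are invented for illustration only.

```python
# Hypothetical intent vocabularies; a real system would learn these
# from data rather than hard-code them.
INTENT_KEYWORDS = {
    "adverse_reaction": ["adverse", "reaction", "side effect", "interaction"],
    "dosing": ["dose", "dosing", "mg", "how much"],
    "treatment": ["treatment", "protocol", "therapy", "manage"],
}

def classify_intent(transcript):
    """Pick the intent whose keywords best match the transcript."""
    text = transcript.lower()
    scores = {intent: sum(kw in text for kw in kws)
              for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_intent("What is the recommended dosing for warfarin?"))
# → dosing
```

Distinguishing a dosing question from an adverse-reaction question is exactly the kind of routing decision that determines which knowledge sources get queried downstream.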
A team of clinicians leverages Dragon Medical Mobile Search, a mobile app that allows users to simply speak a request to conduct fast and easy searches on various medical websites such as MedScape, MedLine, Epocrates and Google.
Looking toward the future
Advanced voice- and language-understanding components that are Watson-like or have been created in tandem with Watson efforts will begin to make their way into products before the end of 2012. You can imagine a Watson-powered smartphone app where a physician could ask for treatment protocols, drug information for migraine headaches, information on clinical trials and more. Nuance has a whole series of understanding and reasoning products that will be introduced this year.
Innovation unlocks information and value
The whole world seems to be excited about Watson, but Watson performing on television vs. Watson performing in a clinical scenario are two very different things. In the not-so-distant future, the healthcare industry will experience what so many other industries have: a technology evolution. Doctors considering a patient’s diagnosis will have a tool at their fingertips or a voice command away to aid in decision making, instantly accessing reference materials, prior cases and the latest knowledge in medical journals. Healthcare organizations will be able to easily uncover inefficiencies and reduce readmissions through real-time access to key information from past discharge summaries and trends across patient populations. Watson-like technologies will change healthcare by helping medical professionals confidently determine the best course of action.
Understanding the many players and realizing there is a difference between data, information and intelligence will be the keys to solving our healthcare system’s many problems. We have the technology but now must leverage it. Paying close attention to how information is captured, how it is understood and how it is applied will be critical to navigating ahead.
Q&A with Big Blue:
HMT asked IBM Research medical scientist Dr. Josko Silobrcic, M.D., the following questions about Watson in healthcare:
Q: One of the ways Watson technology will be utilized in healthcare is in clinical decision support for diagnosing and treating clinical conditions. Can you describe in detail how that process might take place?
A: Step 1 – Question Analysis: As in all its applications, the Watson-enabled system will parse and analyze the question’s structured and unstructured (textual) data, to determine what type of inquiry is being pursued and what precisely it is looking for. In the clinical context, the “question” is represented by all the information that is known about the patient – ideally, comprehensively obtained from various electronic sources.
Step 2 – Hypothesis Generation: The Watson-enabled solution will look through vast and diverse medical sources that allow it to build a comprehensive list of possible answers.
Step 3 – Hypothesis and Evidence Scoring: The solution will find relevant passages from a vast number and array of medical evidence sources and use hundreds of complex algorithms to, in parallel, score the candidate answers in terms of their degree of probability.
Step 4 – Final Merging and Ranking: The Watson-enabled solution will use the training/learning experience it gains and refines over time to appropriately select, weigh and combine the algorithms that it has determined work best for this question.
Step 5 – Results Review and Refinement: As output, the clinician will see a list of suggested diagnoses and/or treatments, displayed with their likelihood/confidence – reflecting the combined strength of available evidence that supports them. The solution will be able to suggest obtaining additional information that may increase the likelihood/confidence of diagnoses. When this missing information is provided (i.e., when it becomes available in the patient’s electronic records automatically, or is simply entered by the user), the system will cycle back through its process steps and return a more-confident set of diagnostic or therapeutic suggestions.
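The five-step process outlined above can be sketched as a simple pipeline. This is a schematic only: the sources, scorers and weights below are invented placeholders standing in for Watson’s vast evidence corpora, hundreds of scoring algorithms and learned weighting.

```python
def answer(question_facts, sources, scorers, weights):
    """Schematic of the generate/score/merge/rank cycle."""
    # Step 2: hypothesis generation from diverse sources
    # (step 1, question analysis, is assumed done: facts arrive parsed).
    candidates = {h for src in sources for h in src(question_facts)}
    # Step 3: score each candidate with many independent algorithms.
    evidence = {c: [s(c, question_facts) for s in scorers] for c in candidates}
    # Step 4: merge scores with learned weights and rank.
    def confidence(c):
        return sum(w * e for w, e in zip(weights, evidence[c]))
    ranked = sorted(candidates, key=confidence, reverse=True)
    # Step 5: return ranked suggestions with confidence for clinician review.
    return [(c, confidence(c)) for c in ranked]

# Toy usage with one invented source and one invented scorer.
sources = [lambda facts: {"influenza", "common cold"}]
scorers = [lambda c, facts: 1.0 if c == "influenza" else 0.2]
print(answer({"fever": True}, sources, scorers, [1.0]))
# → [('influenza', 1.0), ('common cold', 0.2)]
```

New information supplied at step 5 would simply re-enter the same function as additional `question_facts`, producing a re-ranked list, which mirrors the cycle-back behavior described above.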
Q: Watson’s ability to understand nuance (no pun intended) was often touted on “Jeopardy!” Will that ability help with parsing the physician notes to reach a diagnosis of condition and lay out treatment plan options?
A: It will. The Watson system’s use of sophisticated natural language processing (NLP) will be leveraged in both handling and interpreting textual data inputs, such as physician notes, and in specifically matching patient information with the most pertinent reference sources, such as medical journals, treatment guidelines and the like. The system’s ability to detect and interpret the subtle differences and meaning in any text associated with the patient may make the difference in whether it can identify the most personalized evidence for that patient or not. Sophisticated textual analysis may “understand” that “cut down from two to one pack a day” in a patient’s notes means the patient is currently a smoker (which may impact both the diagnostic and treatment decisions for that person), even if this fact is not explicitly recorded anywhere else in the patient’s records. Therefore, it will be able to discover and interpret even the most complex and nuanced data and evidence.
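The pack-a-day example above can be caricatured in a few lines. A real system would use trained models over full clinical narratives; this single hand-written pattern is purely illustrative of inferring an unstated fact from nuanced phrasing.

```python
import re

def infers_current_smoker(note):
    """Infer current smoking status from implicit language in a note."""
    text = note.lower()
    # Explicit negatives win outright.
    if re.search(r"quit smoking|former smoker|non-?smoker", text):
        return False
    # "cut down from X to Y pack(s) a day" implies ongoing smoking,
    # even though "smoker" never appears.
    if re.search(r"cut down from .+ to .+ pack", text):
        return True
    return bool(re.search(r"\bsmokes\b|packs? a day", text))

print(infers_current_smoker("Cut down from two to one pack a day."))  # → True
```

The fact inferred here (current smoker) is exactly the kind of detail that, as Dr. Silobrcic notes, may never be explicitly recorded anywhere else in the patient’s records.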
Q: As a physician, what do you think will be the challenges of accommodating the Watson system in the physician workflow?
A: Watson-enabled solutions will never make patient-care decisions, but only inform them. Thus, one of the key initial challenges actually lies in properly managing expectations about the technology itself, as well as its potential impact on the workflow. In considering workflow integration requirements, it is very possible that a physician and her/his patient may be reviewing some suggestions provided by a Watson-enabled solution together, and/or even jointly providing some additional information to the system. Moreover, different activities in using a Watson-enabled solution may be shared with other members of the care team, and may thus also be distributed over and integrated into their workflows.
Q: Can you describe some other areas of healthcare IT where the Watson technology could prove useful?
A: While IT in clinical diagnosis and treatment immediately comes to mind for many, the array of healthcare areas where the Watson technology could provide important value is almost boundless. You can simply start by thinking of all the circumstances where important healthcare-relevant information may currently be “locked” in vast amounts of continuously expanding and evolving textual data. Or, you can think of any situation in healthcare where someone needs to access and interpret large and diverse quantities of textual and other information/evidence. Finally, Watson technology could be used by everyone from patients to various clinicians, healthcare researchers and biomedical scientists.