Speech Recognition


Five ways speech will transform medicine

Recent advancements will expand speech recognition opportunities. By Dr. Nick van Terheyden

A substantial amount of a clinician's time is consumed by administrative tasks. While important to the care process, these tasks are invisible to patients and payers and can pose an added burden to a resource-challenged healthcare system. In a recent study published in the Archives of Internal Medicine, the author documented 70 orders placed; 30 prescriptions written; 19 clinical notes reviewed, edited and signed; and 15 dictations – totaling close to 60 minutes of dictation each day. To maximize efficiency and derive the most value from our resources, we need to minimize the effort and time these tasks consume. Speech and its associated cousins of natural language processing (NLP) and clinical language understanding (CLU) technology are boosting existing technological innovations to let clinicians focus more on direct patient care and less on documentation.


Here are five innovations coming to healthcare that, with core speech and NLP/CLU technology, will take efficiency and documentation to a whole new level:




1. Cloud-based speech recognition


A recent survey issued by Spyglass Consulting Group states that 98 percent of physicians use mobile devices in their personal and professional lives. Speech recognition has long been tied to the desktop, but healthcare is now moving to a more mobile platform, freed from the shackles of the desktop or computers on wheels (COWs).


This is not just about convenience for the clinical team, but also about the move to more direct bedside capture of information, removing the opportunity for error or omission. Until now, speech recognition on mobile devices has been difficult, but the move to the cloud opens the door to an untethered clinical workflow. Any mobile-connected device with a microphone now offers both on-the-go access to current patient clinical data and the ability for the clinician to record his or her notes. The medical decision-making process is now connected directly, through the device, to the patient record.
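To make that workflow concrete, here is a minimal sketch of untethered bedside dictation, assuming a generic cloud speech-recognition service and a generic EHR notes endpoint. The URLs, payload shapes and field names below are hypothetical, not any particular vendor's API.

```python
# A minimal sketch of an untethered dictation workflow. The endpoints and
# JSON fields are hypothetical placeholders, not a specific vendor's API.
import requests

SPEECH_URL = "https://speech.example-cloud.com/v1/recognize"            # hypothetical
EHR_NOTE_URL = "https://ehr.example-hospital.org/patients/{id}/notes"   # hypothetical


def transcribe_bedside_note(audio_path: str, patient_id: str, api_key: str) -> str:
    """Send recorded audio to the cloud recognizer, then file the transcript
    as a clinical note on the patient's record."""
    with open(audio_path, "rb") as audio_file:
        response = requests.post(
            SPEECH_URL,
            headers={"Authorization": f"Bearer {api_key}",
                     "Content-Type": "audio/wav"},
            data=audio_file,
            timeout=30,
        )
    response.raise_for_status()
    transcript = response.json()["transcript"]  # assumed response field

    # Attach the transcribed note directly to the patient record so the
    # documentation is captured at the bedside rather than re-keyed later.
    requests.post(
        EHR_NOTE_URL.format(id=patient_id),
        headers={"Authorization": f"Bearer {api_key}"},
        json={"note_type": "progress", "text": transcript},
        timeout=30,
    ).raise_for_status()
    return transcript
```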


2. Navigation and control using speech intelligence

Building on the availability of speech in the cloud, physicians are able to use voice to navigate clinical systems. Apple’s Siri showed the potential for speech by permitting the technology to use location, date, time, and calendar and contact data as inputs to voice commands. By accessing similar data in a clinical setting, we can offer clinicians improved efficiency in carrying out some of the more time-consuming administrative tasks. Speech intelligence allows a simple command, such as, “Prescribe enalapril 10 mg daily, metformin 10 mg daily and order a hemoglobin A1C test,” to enter the data for the two prescriptions and the order for a lab test for the patient.
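As a rough illustration only, the sketch below shows the kind of structured output a speech-intelligence layer might derive from that spoken command. A production CLU engine would use full language understanding rather than pattern matching; the parser and field names here are purely illustrative.

```python
# Toy stand-in for a CLU layer: turn a dictated order into structured
# prescription and lab-order entries. All field names are illustrative.
import re
from dataclasses import dataclass


@dataclass
class Prescription:
    drug: str
    dose: str
    frequency: str


@dataclass
class LabOrder:
    test: str


RX_PATTERN = re.compile(r"([a-z]+) (\d+ ?mg) (daily|bid|tid)", re.IGNORECASE)
LAB_PATTERN = re.compile(r"order an? ([\w ]+?) test", re.IGNORECASE)


def parse_voice_command(command: str):
    """Extract prescriptions and lab orders from a dictated command."""
    prescriptions = [Prescription(drug, dose, freq)
                     for drug, dose, freq in RX_PATTERN.findall(command)]
    labs = [LabOrder(test) for test in LAB_PATTERN.findall(command)]
    return prescriptions, labs


rx, labs = parse_voice_command(
    "Prescribe enalapril 10 mg daily, metformin 10 mg daily "
    "and order a hemoglobin A1C test"
)
# rx   -> two Prescription entries (enalapril and metformin, 10 mg daily)
# labs -> one LabOrder for the hemoglobin A1C test
```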


3. Cloud-based medical intelligence

We’ll start to see intelligent voice solutions equipped with the ability to access patient medical records, the context of the disease and knowledge-base systems, including evidence-based medicine (EBM) and clinical decision-support systems


Nick van Terheyden, M.D., is CMIO, Nuance. For more on Nuance: www.rsleads.com/211ht-208

