In recent years, medical advances have made the impossible possible: we can now diagnose and cure diseases with tools ranging from specially designed molecules to lasers. Nonetheless, many aspects of everyday clinical care remain error-prone and leave considerable room for improvement. With upcoming tools like Google Glass, which have the potential to shift computing from stationary or handheld devices to wearable technology, medicine is anticipating another revolution in devices and procedures.
This new technology will allow users to interact continuously with medical software to make more informed choices, reduce the documentation burden and receive instantaneous feedback. Overall, this will be a major boost to clinical care. Wearable information technologies are increasingly available, and industry has already identified the health care sector as an important field of application. For example, Qualcomm and Palomar Health recently launched a glassware medical incubator named Glassomics to explore potential uses of wearable technology in the medical environment.
A key task in the medical sector is medication prescription, preparation and administration. Studies have shown that errors occur in up to almost 50% of intravenous medication preparations and administrations, with one of the biggest problems being the preparation of drugs that require multiple steps. Dosage is another frequent problem, and life-threatening overdoses have been observed in 0.5% of cases. Medication preparation and administration is therefore a clinical workflow well worth improving. Furthermore, computer-based decision support for medication has matured in the last decade, especially in the context of computerized provider order entry (CPOE).
We propose a novel approach that uses wearable technology to improve medication administration. One challenging aspect of integrating wearable technology into existing workflows is the reliable detection of the task currently performed by the wearer. Research has focused on this problem for more than a decade and presents promising approaches for sensor-based activity recognition. Early adoptions of wearable technology in the clinic already detect specific activities of everyday work and thus provide positive indications of feasibility.
The prototype client software uses the built-in camera to automatically detect barcodes in the current field of view. QR codes are used in the prototype application because they are easy to create and process; however, almost any kind of barcode could be adopted. The barcodes encode identifiers of the patient and the medication, respectively. Each identifier consists of a leading letter (P for patient, M for medication) and a trailing unique number. As soon as both a patient and a medication number have been detected, a request is sent to the transaction server. The transaction server uses lookup tables to load patient and medication information. Since complete reasoning about medication interactions has not been implemented for the prototype, blacklists and whitelists of medications were created for each patient. The server uses the identified patient and medication information to check the patient-specific lists for violations or prescriptions. Feedback in traffic-light style is sent back to the user:
- Green: Drug is marked as prescribed in the system and no contraindication exists. Medication is allowed.
- Yellow: Neither a prescription nor a contraindication was found. Medication should be confirmed with a medical doctor.
- Red: A violation or contraindication has been detected. Medication is not permitted.
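The server-side check described above can be sketched in a few lines. This is a hypothetical illustration, not the actual prototype code; the table contents, function name and identifier values are invented for the example, following the P/M identifier scheme from the text.

```python
# Hypothetical sketch of the transaction server's traffic-light check,
# assuming per-patient whitelist/blacklist lookup tables as described.
PRESCRIBED = {"P1001": {"M2001", "M2002"}}   # whitelist: prescribed medications
CONTRAINDICATED = {"P1001": {"M2099"}}       # blacklist: contraindicated medications

def check_medication(patient_id: str, medication_id: str) -> str:
    """Return 'green', 'yellow' or 'red' for a patient/medication pair."""
    if medication_id in CONTRAINDICATED.get(patient_id, set()):
        return "red"      # violation or contraindication: not permitted
    if medication_id in PRESCRIBED.get(patient_id, set()):
        return "green"    # prescribed, no contraindication: allowed
    return "yellow"       # unknown: confirm with a medical doctor

print(check_medication("P1001", "M2001"))  # green
print(check_medication("P1001", "M2099"))  # red
print(check_medication("P1001", "M2500"))  # yellow
```

The contraindication check deliberately takes precedence over the prescription check, so a drug that appears on both lists is still flagged red.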
Device interaction is needed for each step of the medical workflow. Two of these interactions consist of looking at barcodes that already exist today. First, the patient is identified by reading the patient ID from a barcode on the printed patient record. Second, the medication itself, including dose information, is retrieved by reading the barcode printed on the medication package during the preparation step. Neither interaction interferes with the regular lookup of patient information and prescribed medication. Audiovisual feedback is presented upon successful scanning of a barcode. Immediately after both patient and medication information have been received, the workflow transaction server checks the medication knowledge base for violations. If any violations are found, an alarm signal in traffic-light style is displayed to the user: green means no violation; yellow means a mild violation (e.g., possible overdosage); and red means a critical violation (e.g., life-threatening overdose or adverse interactions with other medication). Detailed warning information can be displayed either directly on the Google Glass or on a computer in the nursing station.
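The client-side scanning logic above — classify each scanned code by its leading letter and trigger the server request once both a patient and a medication code have been seen — could look roughly as follows. The class name, method names and sample identifiers are assumptions made for this sketch; only the identifier format (leading P or M, trailing number) comes from the text.

```python
import re

# Hypothetical client-side sketch: classify scanned identifiers and report
# when both a patient and a medication code are available for the server request.
ID_PATTERN = re.compile(r"^([PM])(\d+)$")  # leading letter + trailing unique number

class ScanSession:
    def __init__(self):
        self.patient_id = None
        self.medication_id = None

    def on_scan(self, code: str):
        """Record a scanned code; return (patient, medication) once both are known."""
        match = ID_PATTERN.match(code)
        if not match:
            return None                      # not a recognized identifier: ignore
        if match.group(1) == "P":
            self.patient_id = code
        else:
            self.medication_id = code
        if self.patient_id and self.medication_id:
            return (self.patient_id, self.medication_id)  # ready to query the server
        return None

session = ScanSession()
print(session.on_scan("P1001"))   # None — patient only, keep scanning
print(session.on_scan("XYZ"))     # None — not an identifier
print(session.on_scan("M2001"))   # ('P1001', 'M2001')
```

Rescanning a code of the same type simply overwrites the stored value, so a corrected scan (e.g., of a different medication package) replaces the earlier one.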
In a final step, documentation of the action is submitted to the HIS in real time for further review, creating a provisional record of the medication administration. This documentation has to be verified by the clinical staff at a later point. Re-administration of the same medication triggers an appropriate warning from the system.
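The documentation step can be sketched as follows. The in-memory list stands in for the HIS interface, and the record fields and return strings are assumptions for illustration; the text specifies only that records are submitted in real time, verified later by staff, and that re-administration yields a warning.

```python
from datetime import datetime

# Hypothetical sketch of the real-time documentation step: each administration
# is logged for later verification, and re-administration raises a warning.
administration_log = []  # stand-in for records submitted to the HIS

def document_administration(patient_id: str, medication_id: str) -> str:
    duplicate = any(r["patient"] == patient_id and r["medication"] == medication_id
                    for r in administration_log)
    administration_log.append({
        "patient": patient_id,
        "medication": medication_id,
        "time": datetime.now().isoformat(),
        "verified": False,   # to be confirmed by clinical staff later
    })
    return "warning: repeated administration" if duplicate else "documented"

print(document_administration("P1001", "M2001"))  # documented
print(document_administration("P1001", "M2001"))  # warning: repeated administration
```

In a real deployment the duplicate check would likely be time-windowed (a repeat dose hours later may be legitimate), which is why the warning is advisory rather than blocking.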