Research Article
Multimodal Data Analysis and Visualization to Study the Usage of Electronic Health Records
@INPROCEEDINGS{10.4108/icst.pervasivehealth.2013.252025,
  author={Nadir Weibel and Shazia Ashfaq and Alan Calvitti and James Hollan and Zia Agha},
  title={Multimodal Data Analysis and Visualization to Study the Usage of Electronic Health Records},
  proceedings={7th International Conference on Pervasive Computing Technologies for Healthcare},
  publisher={IEEE},
  proceedings_a={PERVASIVEHEALTH},
  year={2013},
  month={5},
  keywords={electronic health records, multimodal, visualization, usability, kinect},
  doi={10.4108/icst.pervasivehealth.2013.252025}
}
Nadir Weibel
Shazia Ashfaq
Alan Calvitti
James Hollan
Zia Agha
Year: 2013
PERVASIVEHEALTH
ICST
DOI: 10.4108/icst.pervasivehealth.2013.252025
Abstract
Understanding interaction with Electronic Health Records (EHR) often means understanding the multimodal nature of the physician-patient interaction and the interaction with other materials (e.g., paper charts), in addition to analyzing the tasks the physician performs on the computer system. Recent approaches have begun to analyze and quantify speech, gaze, body movements, etc., and represent a very promising way to complement classic software usability analysis. However, characterizing multimodal activity is hard, since it often requires manually coding hours of video data. We present our approach, which uses automatic tracking of body movements, audio signals, and gaze in the medical office to enable multimodal analysis of EHR use.
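The paper's tracking pipeline is not reproduced on this page, but the core idea of analyzing independently captured streams (e.g., Kinect body tracking, audio, gaze) on a shared timeline can be sketched as follows. This is a minimal illustrative sketch under assumed data structures, not the authors' implementation: the Event type, the stream names, and the co_occurrence helper are all hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    """One detected interval in a single modality stream (assumed format)."""
    stream: str   # e.g. "gaze", "speech", "skeleton"
    start: float  # seconds from session start
    end: float
    label: str    # e.g. "screen", "patient", "leaning_forward"

def overlap(a: Event, b: Event) -> float:
    """Seconds during which two events co-occur on the shared timeline."""
    return max(0.0, min(a.end, b.end) - max(a.start, b.start))

def co_occurrence(events: List[Event], s1: str, s2: str) -> float:
    """Total time two streams overlap, e.g. gaze-on-EHR while someone speaks."""
    total = 0.0
    for a in (e for e in events if e.stream == s1):
        for b in (e for e in events if e.stream == s2):
            total += overlap(a, b)
    return total

if __name__ == "__main__":
    # Hypothetical session data: physician looks at the screen for 12.5 s
    # while patient and physician speech turns are detected from audio.
    session = [
        Event("gaze", 0.0, 12.5, "screen"),
        Event("speech", 3.0, 9.0, "patient"),
        Event("speech", 10.0, 15.0, "physician"),
    ]
    print(co_occurrence(session, "gaze", "speech"))  # 8.5 s of gaze-on-screen during speech
```

Measures like this co-occurrence time are one way automatic tracking can replace manual video coding: once each modality is reduced to timestamped events, cross-modal statistics and visualizations follow from simple interval arithmetic.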