
Research Article
Headmotion: Human-Machine Interaction Requires Only Your Head Movement
@INPROCEEDINGS{10.1007/978-3-031-04231-7_9,
  author    = {Duoteng Xu and Peizhao Zhu and Chuyu Zheng},
  title     = {Headmotion: Human-Machine Interaction Requires Only Your Head Movement},
  booktitle = {Edge Computing and IoT: Systems, Management and Security. Second EAI International Conference, ICECI 2021, Virtual Event, December 22--23, 2021, Proceedings},
  year      = {2022},
  month     = {5},
  keywords  = {Human-computer interaction; IMU; Android; Machine learning},
  doi       = {10.1007/978-3-031-04231-7_9}
}
Duoteng Xu
Peizhao Zhu
Chuyu Zheng
Year: 2022
ICECI
Springer
DOI: 10.1007/978-3-031-04231-7_9
Abstract
With the growing demand for smart wearable devices and the rapid development of pervasive computing, new human-computer interaction methods for wearable devices are continually being proposed to make up for the shortcomings of traditional interaction methods and to improve the efficiency and ubiquity of interaction.
To date, human-computer interaction with wearable devices has been dominated by contact-based input, such as touching a screen or pressing physical buttons. This is convenient in most scenarios but limiting in others: people with hand impairments may be unable to interact by touch, and drivers should not take their hands off the wheel to operate a device. To address this shortcoming, we designed a natural interaction system for ear-worn smart devices, built around a deep network that recognizes head motion from the acceleration, angular velocity, and angle data collected by an inertial measurement unit worn on the ear. The system enables head-motion interaction in a variety of complex scenarios, largely freeing the user's hands. We have experimentally verified the high accuracy and real-time performance of the designed system.
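The abstract does not reproduce the recognition model itself. As an illustration only, the sketch below shows what a deep classifier over ear-worn IMU windows might look like: nine input channels (tri-axial acceleration, angular velocity, and angle), a fixed window length, and a small set of head-gesture classes. The channel count, window size, class set, and architecture are assumptions for illustration, not the authors' actual network.

# Illustrative sketch only: a small 1D-CNN classifier over ear-worn IMU windows.
# All hyperparameters below are assumed, not taken from the paper.
import torch
import torch.nn as nn

NUM_CHANNELS = 9     # 3-axis acceleration + 3-axis angular velocity + 3-axis angle (assumed)
WINDOW_LEN   = 128   # samples per head-motion window (assumed)
NUM_CLASSES  = 6     # e.g. nod, shake, tilt left/right, look up/down (assumed)

class HeadMotionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, NUM_CHANNELS, WINDOW_LEN) -> (batch, NUM_CLASSES)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = HeadMotionNet()
    dummy = torch.randn(4, NUM_CHANNELS, WINDOW_LEN)  # a batch of 4 IMU windows
    print(model(dummy).shape)  # torch.Size([4, NUM_CLASSES])

In a deployment like the one described, such a model would consume sliding windows of streamed IMU samples and map each window to a head-gesture command; the real system's segmentation, feature choices, and network design may differ.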