Wireless Mobile Communication and Healthcare. 11th EAI International Conference, MobiHealth 2022, Virtual Event, November 30 – December 2, 2022, Proceedings

Research Article

Harnessing the Role of Speech Interaction in Smart Environments Towards Improved Adaptability and Health Monitoring

Cite
  • @INPROCEEDINGS{10.1007/978-3-031-32029-3_24,
        author={F\'{a}bio Barros and Ana Rita Valente and Ant\'{o}nio Teixeira and Samuel Silva},
        title={Harnessing the Role of Speech Interaction in Smart Environments Towards Improved Adaptability and Health Monitoring},
        proceedings={Wireless Mobile Communication and Healthcare. 11th EAI International Conference, MobiHealth 2022, Virtual Event, November 30 -- December 2, 2022, Proceedings},
        proceedings_a={MOBIHEALTH},
        year={2023},
        month={5},
        keywords={speech interaction, nonverbal speech features, health monitoring, eHealth, multimodal architectures},
        doi={10.1007/978-3-031-32029-3_24}
    }
    
  • Fábio Barros, Ana Rita Valente, António Teixeira, Samuel Silva. Harnessing the Role of Speech Interaction in Smart Environments Towards Improved Adaptability and Health Monitoring. MOBIHEALTH, 2023. Springer. DOI: 10.1007/978-3-031-32029-3_24
Fábio Barros1, Ana Rita Valente1, António Teixeira1, Samuel Silva1,*
  • 1: Institute of Electronics and Informatics Engineering of Aveiro (IEETA)
*Contact email: sss@ua.pt

Abstract

The way we communicate with speech goes far beyond the words we use, and nonverbal cues play a pivotal role in, e.g., conveying emphasis or expressing emotion. Furthermore, speech can also serve as a biomarker for a range of health conditions, e.g., Alzheimer’s disease. With the strong evolution of speech technologies in recent years, speech has been increasingly adopted for interaction with machines and environments, e.g., our homes. While strong advances are being made in capturing the different verbal and nonverbal aspects of speech, the resulting features are often made available in standalone applications and/or for very specific scenarios. Given their potential to inform adaptability and support eHealth, it is desirable to make them an integral part of interactive ecosystems, taking advantage of the rising role of speech as a ubiquitous form of interaction. In this regard, our aim is to propose how this integration can be performed in a modular and expandable manner. To this end, this work presents a first reflection on how these different dimensions may be considered in the scope of a smart environment, through a seamless and expandable integration around speech as an interaction modality. It proposes a first iteration of an architecture to support this vision and a first implementation that shows its feasibility and potential.
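
To make the envisioned integration more concrete, the sketch below shows, in Python, one possible way a smart-environment speech hub could fan a single utterance out to both an interaction handler and pluggable nonverbal-feature extractors that feed a health-monitoring observer. This is a minimal illustrative sketch; names such as SpeechHub, prosody_features, and health_monitor are hypothetical, and it is not the architecture proposed in the paper.

# Illustrative sketch only: a modular, expandable pipeline where one speech
# event serves both interaction and health monitoring. All names hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class SpeechEvent:
    """A recognized utterance plus raw acoustic descriptors."""
    transcript: str
    pitch_hz: float          # mean fundamental frequency (assumed precomputed)
    speech_rate_wpm: float   # words per minute (assumed precomputed)


# A feature extractor maps a SpeechEvent to named nonverbal features.
FeatureExtractor = Callable[[SpeechEvent], Dict[str, float]]


@dataclass
class SpeechHub:
    """Routes each event to command handling and to registered extractors."""
    extractors: List[FeatureExtractor] = field(default_factory=list)
    observers: List[Callable[[Dict[str, float]], None]] = field(default_factory=list)

    def register_extractor(self, extractor: FeatureExtractor) -> None:
        self.extractors.append(extractor)

    def register_observer(self, observer: Callable[[Dict[str, float]], None]) -> None:
        self.observers.append(observer)

    def handle(self, event: SpeechEvent) -> None:
        # 1) Interaction path: act on the verbal content (placeholder action).
        print(f"[interaction] command understood: '{event.transcript}'")
        # 2) Monitoring path: collect nonverbal features and notify observers.
        features: Dict[str, float] = {}
        for extract in self.extractors:
            features.update(extract(event))
        for notify in self.observers:
            notify(features)


def prosody_features(event: SpeechEvent) -> Dict[str, float]:
    # Example pluggable extractor exposing prosodic descriptors.
    return {"mean_pitch_hz": event.pitch_hz, "speech_rate_wpm": event.speech_rate_wpm}


def health_monitor(features: Dict[str, float]) -> None:
    # A real monitor might trend these values over time; here we just log them.
    print(f"[monitoring] nonverbal features: {features}")


if __name__ == "__main__":
    hub = SpeechHub()
    hub.register_extractor(prosody_features)
    hub.register_observer(health_monitor)
    hub.handle(SpeechEvent("turn on the living room lights", pitch_hz=118.0, speech_rate_wpm=142.0))

New extractors (e.g., for voice-quality or disfluency measures) or new observers could be registered in the same way, which is one way to read the modularity and expandability the abstract argues for.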

Keywords
speech interaction, nonverbal speech features, health monitoring, eHealth, multimodal architectures
Published
2023-05-14
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-32029-3_24