
Research Article
Inverse Pyramid Pooling Attention for Ultrasonic Image Signal Recognition
@inproceedings{10.1007/978-3-031-55471-1_8,
  author    = {Zhiwen Jiang and Ziji Ma and Xianglong Dong and Qi Wang and Xun Shao},
  title     = {Inverse Pyramid Pooling Attention for Ultrasonic Image Signal Recognition},
  booktitle = {Mobile Networks and Management. 13th EAI International Conference, MONAMI 2023, Yingtan, China, October 27--29, 2023, Proceedings},
  proceedings_a = {MONAMI},
  publisher = {Springer},
  year      = {2024},
  month     = {3},
  keywords  = {Convolutional neural network; Inverse pyramid attention; Combinatorial pooling; Ultrasound signal recognition},
  doi       = {10.1007/978-3-031-55471-1_8}
}
- Zhiwen Jiang
- Ziji Ma
- Xianglong Dong
- Qi Wang
- Xun Shao
Year: 2024
Inverse Pyramid Pooling Attention for Ultrasonic Image Signal Recognition
MONAMI
Springer
DOI: 10.1007/978-3-031-55471-1_8
Abstract
Ultrasound is widely used for diagnosis and detection in a variety of fields, but analysing ultrasound echo signals is challenging: it demands substantial time from professionals and relies on subjective judgement. With advances in artificial intelligence, more and more fields are being aided by it, improving not only efficiency but also overall accuracy. In this paper, an inverse pyramid pooling attention (IPPA) mechanism is proposed for images transformed from ultrasound echo signals. IPPA performs different pooling operations at multiple scale levels on each channel of the feature matrix, capturing rich regional feature associations and thereby improving the channel representations. In addition, different probability factors are assigned to the different pooling operations, and cross-channel information is extracted by an adaptive 1D convolution to widen the network model's range of adaptation. Experimental results on a 10-class ultrasound hyper-dataset (consisting of three sub-datasets) show that ResNet integrated with IPPA improves on the sensitivity and robustness of the original ResNet, reaching an accuracy of up to 99.68%.
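The mechanism the abstract describes (multi-scale pooling per channel, a probability-weighted mix of pooling operators, and an ECA-style adaptive 1D convolution across channel descriptors) could be sketched roughly as below. This is a minimal NumPy illustration, not the authors' implementation: the function name, pyramid levels, probability factors, and kernel size are all illustrative assumptions.

```python
import numpy as np

def ippa_channel_attention(x, levels=(1, 2, 4), p_avg=0.6, p_max=0.4, k=3):
    """Hedged sketch of an IPPA-style channel attention.

    x: feature map of shape (C, H, W).
    For each channel, pool over level x level spatial regions at every
    pyramid level, mix average and max pooling with probability factors
    p_avg / p_max (values assumed here), then run a 1D convolution of
    size k across the per-channel descriptors and apply a sigmoid to
    obtain channel attention weights.
    """
    C, H, W = x.shape
    desc = np.zeros(C)
    for level in levels:
        # Split each channel's spatial grid into level x level regions.
        hs = np.array_split(np.arange(H), level)
        ws = np.array_split(np.arange(W), level)
        for c in range(C):
            vals = []
            for hi in hs:
                for wi in ws:
                    region = x[c][np.ix_(hi, wi)]
                    # Probability-weighted combination of poolings.
                    vals.append(p_avg * region.mean() + p_max * region.max())
            desc[c] += np.mean(vals)
    desc /= len(levels)

    # 1D convolution across channels with "same" padding (fixed mean
    # kernel here; the paper uses an adaptively sized learned kernel).
    pad = k // 2
    kernel = np.ones(k) / k
    padded = np.pad(desc, pad, mode="edge")
    conv = np.array([np.dot(padded[i:i + k], kernel) for i in range(C)])

    weights = 1.0 / (1.0 + np.exp(-conv))  # sigmoid gate per channel
    return x * weights[:, None, None]
```

The sketch returns the input feature map rescaled per channel, mirroring how a channel-attention block would be inserted into a ResNet stage.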