Mobile Multimedia Communications. 15th EAI International Conference, MobiMedia 2022, Virtual Event, July 22-24, 2022, Proceedings

Research Article

Research on Music Genre Classification Based on Residual Network

Cite (BibTeX)
  • @INPROCEEDINGS{10.1007/978-3-031-23902-1_16,
        author={Zhongwei Xu and Yuan Feng and Shengyu Song and Yuanxiang Xu and Ruiying Wang and Lan Zhang and Jiahao Liu},
        title={Research on Music Genre Classification Based on Residual Network},
        proceedings={Mobile Multimedia Communications. 15th EAI International Conference, MobiMedia 2022, Virtual Event, July 22-24, 2022, Proceedings},
        proceedings_a={MOBIMEDIA},
        year={2023},
        month={2},
        keywords={Residual Network (ResNet), MobileNet, Deep learning, Short-Time Fourier Transform (STFT), Mel-Frequency Cepstral Coefficient (MFCC)},
        doi={10.1007/978-3-031-23902-1_16}
    }
    
Zhongwei Xu1, Yuan Feng1,*, Shengyu Song1, Yuanxiang Xu1, Ruiying Wang2, Lan Zhang1, Jiahao Liu1
  • 1: College of Information Science and Engineering, Ocean University of China
  • 2: Teaching Center of Fundamental Courses, Ocean University of China
*Contact email: fengyuan@ouc.edu.cn

Abstract

With the rapid development of information technology, the number of available songs is growing explosively, which makes music genre classification a challenging task; automated genre classification is therefore an active research topic. Mobile devices are ubiquitous in people's lives and work and make it possible to work anywhere at any time, but their limited resources impose strict requirements on models that traditional approaches struggle to meet. We apply deep learning to automatically identify and classify music, and adopt the MobileNet model to achieve lightweight music classification on mobile devices while improving classification accuracy. In this paper, we conduct experiments mainly on the Free Music Archive dataset and perform music genre classification based on the ResNet-101 and MobileNet models. Music features are extracted mainly with the Short-Time Fourier Transform (STFT) and Mel-Frequency Cepstral Coefficients (MFCC), and the data preprocessing is improved. Compared with other methods, our approach achieves an accuracy about 7% higher than the traditional CRNN method. For the lightweight mobile implementation, the parameter size of the model trained with MobileNet is only 4% of that of the best model in this paper, while still reaching a high accuracy.
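To make the pipeline in the abstract concrete, the following is a minimal sketch of the two stages it describes: extracting STFT/MFCC features from audio clips and classifying them with either a MobileNetV2 or a ResNet-101 backbone. It assumes librosa for feature extraction and torchvision for the models; clip paths, the number of genres, feature sizes, and all hyperparameters are placeholders, and this is an illustrative starting point rather than the authors' implementation.

    # Illustrative sketch only -- NOT the authors' code. Assumes Free Music
    # Archive clips are available locally; paths and settings are placeholders.
    import librosa
    import numpy as np
    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_GENRES = 8          # e.g. FMA-small's 8 balanced genres (assumption)
    SAMPLE_RATE = 22050
    N_MFCC = 20

    def extract_features(wav_path: str) -> torch.Tensor:
        """Turn one audio clip into a 3-channel 'image' of spectral features."""
        y, sr = librosa.load(wav_path, sr=SAMPLE_RATE, mono=True)
        # Short-Time Fourier Transform -> log-magnitude spectrogram
        stft = np.abs(librosa.stft(y, n_fft=2048, hop_length=512))
        log_stft = librosa.amplitude_to_db(stft, ref=np.max)
        # Mel-Frequency Cepstral Coefficients
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=N_MFCC)

        def to_square(m, size=224):
            # Resize a 2-D feature map to a fixed square for the CNN input
            t = torch.tensor(m, dtype=torch.float32)[None, None]   # (1,1,H,W)
            t = torch.nn.functional.interpolate(
                t, size=(size, size), mode="bilinear", align_corners=False)
            return t[0, 0]

        # Stack the feature maps as channels and normalise per clip
        img = torch.stack([to_square(log_stft), to_square(mfcc), to_square(log_stft)])
        return (img - img.mean()) / (img.std() + 1e-6)

    def build_classifier(lightweight: bool = True) -> nn.Module:
        """MobileNetV2 for the lightweight on-device model, ResNet-101 otherwise."""
        if lightweight:
            model = models.mobilenet_v2(weights=None)
            model.classifier[1] = nn.Linear(model.last_channel, NUM_GENRES)
        else:
            model = models.resnet101(weights=None)
            model.fc = nn.Linear(model.fc.in_features, NUM_GENRES)
        return model

    if __name__ == "__main__":
        mobile = build_classifier(lightweight=True)
        resnet = build_classifier(lightweight=False)
        count = lambda m: sum(p.numel() for p in m.parameters())
        # MobileNetV2 has far fewer parameters than ResNet-101, which is the
        # motivation for the lightweight on-device variant in the abstract.
        print(f"MobileNetV2 params: {count(mobile) / 1e6:.1f}M, "
              f"ResNet-101 params: {count(resnet) / 1e6:.1f}M")

As a rough usage pattern, each clip would be passed through extract_features and the resulting tensors batched for training with a standard cross-entropy loss; comparing the two printed parameter counts illustrates the size gap that motivates the MobileNet variant, though the exact 4% figure in the abstract refers to the authors' own best model.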

Keywords
Residual Network (ResNet), MobileNet, Deep learning, Short-Time Fourier Transform (STFT), Mel-Frequency Cepstral Coefficient (MFCC)
Published
2023-02-01
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-23902-1_16