Advanced Hybrid Information Processing. First International Conference, ADHIP 2017, Harbin, China, July 17–18, 2017, Proceedings

Research Article

Performance Analysis of Sparsity-Penalized LMS Algorithms in Channel Estimation

Cite
  • @INPROCEEDINGS{10.1007/978-3-319-73317-3_47,
        author={Jie Yang and Hao Huang and Jie Wang and Sheng Hong and Zijian Hua and Jian Zhang and Guan Gui},
        title={Performance Analysis of Sparsity-Penalized LMS Algorithms in Channel Estimation},
        proceedings={Advanced Hybrid Information Processing. First International Conference, ADHIP 2017, Harbin, China, July 17--18, 2017, Proceedings},
        proceedings_a={ADHIP},
        year={2018},
        month={2},
        keywords={Gradient descent, Least mean squares, Sparse constraint, Adaptive channel estimation, Compressive sensing},
        doi={10.1007/978-3-319-73317-3_47}
    }
    
Jie Yang1,*, Hao Huang1, Jie Wang1, Sheng Hong1, Zijian Hua1, Jian Zhang1, Guan Gui1,*
  • 1: Nanjing University of Posts and Telecommunications
*Contact email: jyang@njupt.edu.cn, guiguan@njupt.edu.cn

Abstract

The least mean squares (LMS) algorithm is considered one of the effective methods for adaptive system identification. However, for many unknown systems the standard LMS algorithm cannot exploit any structural characteristics. In the case of sparse channels, sparse LMS algorithms have been proposed to exploit channel sparsity, and these methods can achieve better estimation performance than the standard one under the assumption of a Gaussian noise environment. Specifically, several sparse constraint functions, the ℓ1-norm, reweighted ℓ1-norm, and ℓp-norm, are developed to take advantage of channel sparsity. According to the sparse function used, the proposed methods are termed zero-attracting LMS (ZA-LMS), reweighted ZA-LMS (RZA-LMS), reweighted ℓ1-norm LMS (RL1-LMS), and ℓp-norm LMS (LP-LMS). Our simulation results confirm the effectiveness of the new algorithms and show that the proposed sparse algorithms are superior to the standard LMS in numerous scenarios.
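As a rough illustration of the idea (not the paper's exact experimental setup), the sketch below compares standard LMS with ZA-LMS on a toy sparse channel. ZA-LMS adds an ℓ1 (zero-attracting) term to the LMS update, w ← w + μ·e·x − ρ·sgn(w); the channel taps, step size μ, and attractor strength ρ are assumed toy values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse channel (assumed values, not from the paper): 16 taps, 2 nonzero.
N = 16
h = np.zeros(N)
h[[3, 10]] = [1.0, -0.5]

T = 5000
x = rng.standard_normal(T)               # channel input
v = 0.1 * rng.standard_normal(T)         # additive Gaussian noise

def estimate(mu=0.01, rho=0.0):
    """Standard LMS (rho = 0) or ZA-LMS (rho > 0, l1 zero-attracting penalty)."""
    w = np.zeros(N)
    for n in range(N, T):
        xn = x[n - N:n][::-1]            # regressor, most recent sample first
        d = h @ xn + v[n]                # noisy output of the true sparse channel
        e = d - w @ xn                   # instantaneous estimation error
        w = w + mu * e * xn - rho * np.sign(w)  # LMS step + zero attractor
    return w

w_lms = estimate(rho=0.0)                # standard LMS baseline
w_za = estimate(rho=5e-5)                # ZA-LMS with l1 penalty

msd = lambda w: float(np.sum((w - h) ** 2))   # mean square deviation
print(f"LMS    MSD: {msd(w_lms):.2e}")
print(f"ZA-LMS MSD: {msd(w_za):.2e}")
```

Because most taps are zero, the attractor suppresses the gradient noise on inactive taps at the cost of a small bias on active ones, which is why ZA-LMS typically achieves a lower steady-state deviation here. RZA-LMS would replace the attractor with sgn(w)/(1 + ε|w|) to reduce that bias on large taps.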

Keywords
Gradient descent, Least mean squares, Sparse constraint, Adaptive channel estimation, Compressive sensing
Published
2018-02-09
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-319-73317-3_47