Advanced Hybrid Information Processing. Third EAI International Conference, ADHIP 2019, Nanjing, China, September 21–22, 2019, Proceedings, Part II

Research Article

Asynchronous Distributed ADMM for Learning with Large-Scale and High-Dimensional Sparse Data Set

Cite
@INPROCEEDINGS{10.1007/978-3-030-36405-2_27,
    author={Dongxia Wang and Yongmei Lei},
    title={Asynchronous Distributed ADMM for Learning with Large-Scale and High-Dimensional Sparse Data Set},
    proceedings={Advanced Hybrid Information Processing. Third EAI International Conference, ADHIP 2019, Nanjing, China, September 21--22, 2019, Proceedings, Part II},
    proceedings_a={ADHIP PART 2},
    year={2019},
    month={11},
    keywords={GA-ADMM, General form consensus, Bounded asynchronous, Non-convex},
    doi={10.1007/978-3-030-36405-2_27}
}
    
Dongxia Wang1, Yongmei Lei1,*
  • 1: School of Computer Engineering and Science, Shanghai University, No. 333, Nanchen Road, Baoshan District
*Contact email: lei@shu.edu.cn

Abstract

The distributed alternating direction method of multipliers (ADMM) is an effective method for solving large-scale machine learning problems. At present, most distributed ADMM algorithms transmit the entire model parameter during communication, which leads to high communication cost, especially when the model parameter has a very large number of features. In this paper, an asynchronous distributed ADMM algorithm (GA-ADMM) based on general form consensus is proposed. First, the GA-ADMM algorithm filters the information transmitted between nodes by exploiting the characteristics of high-dimensional sparse data sets: only the associated features, rather than all features of the model, need to be transmitted between the workers and the master, which greatly reduces the communication cost. Second, a bounded asynchronous communication protocol is used to further improve the performance of the algorithm. The convergence of the algorithm is also analyzed theoretically for a non-convex objective function. Finally, the algorithm is tested on the cluster supercomputer "Ziqiang 4000". The experiments show that the GA-ADMM algorithm converges when appropriate parameters are selected, requires less system time to reach convergence than the AD-ADMM algorithm, and achieves accuracy comparable to AD-ADMM.
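
The communication-saving idea summarized in the abstract can be made concrete with a small sketch. The Python snippet below is a single-process simulation of general-form-consensus ADMM with feature filtering, written only for illustration: the ridge-style local subproblem, the penalty rho, the array sizes, and the synchronous master loop are assumptions, not the authors' MPI implementation or their bounded asynchronous protocol. It shows the part that matters for communication cost: each worker pulls and pushes only the coordinates of the model that appear in its own sparse data block.

    # Minimal sketch of general-form-consensus ADMM with feature filtering.
    # Illustrative assumptions: ridge-style subproblem, penalty rho, and a
    # synchronous loop standing in for the paper's bounded asynchronous protocol.
    import numpy as np

    def associated_features(A_local):
        """Indices of the columns that are nonzero in this worker's data block."""
        return np.flatnonzero(np.abs(A_local).sum(axis=0) > 0)

    def worker_update(A, b, z_assoc, u_assoc, rho):
        """Local x-update restricted to the associated features:
        argmin_x 0.5*||A x - b||^2 + (rho/2)*||x - z_assoc + u_assoc||^2."""
        d = A.shape[1]
        H = A.T @ A + rho * np.eye(d)
        g = A.T @ b + rho * (z_assoc - u_assoc)
        return np.linalg.solve(H, g)

    rng = np.random.default_rng(0)
    n_workers, n_samples, n_features, rho, n_iters = 4, 50, 200, 1.0, 30

    # Each worker holds a sparse slice of the feature space.
    blocks = []
    for _ in range(n_workers):
        A = np.zeros((n_samples, n_features))
        support = rng.choice(n_features, size=20, replace=False)
        A[:, support] = rng.normal(size=(n_samples, support.size))
        b = rng.normal(size=n_samples)
        cols = associated_features(A)           # features this worker communicates
        blocks.append((A[:, cols], b, cols))

    z = np.zeros(n_features)                        # global consensus variable
    u = [np.zeros(c.size) for (_, _, c) in blocks]  # per-worker scaled duals

    for _ in range(n_iters):
        # In GA-ADMM the master proceeds once a bounded-staleness subset of
        # workers has reported; here every worker reports each round.
        num = np.zeros(n_features)
        cnt = np.zeros(n_features)
        for k, (A, b, cols) in enumerate(blocks):
            x_k = worker_update(A, b, z[cols], u[k], rho)  # pull only z[cols]
            u[k] += x_k - z[cols]
            num[cols] += x_k + u[k]                        # push only cols back
            cnt[cols] += 1
        mask = cnt > 0
        z[mask] = num[mask] / cnt[mask]         # coordinate-wise averaging at master

    print("nonzero consensus coordinates:", int(np.count_nonzero(z)))

Because each worker exchanges vectors of length |cols| instead of the full n_features-dimensional model, the per-iteration message size scales with the worker's local sparsity pattern rather than with the total feature count, which is the source of the communication savings the abstract describes.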

Keywords
GA-ADMM, General form consensus, Bounded asynchronous, Non-convex
Published
2019-11-29
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-030-36405-2_27