Green Energy and Networking. 7th EAI International Conference, GreeNets 2020, Harbin, China, June 27-28, 2020, Proceedings

Research Article

The Crawl and Analysis of Recruitment Data Based on the Distributed Crawler

  • @INPROCEEDINGS{10.1007/978-3-030-62483-5_18,
        author={Jiancai Wang and Jianting Shi},
        title={The Crawl and Analysis of Recruitment Data Based on the Distributed Crawler},
        proceedings={Green Energy and Networking. 7th EAI International Conference, GreeNets 2020, Harbin, China, June 27-28, 2020, Proceedings},
        proceedings_a={GREENETS},
        year={2020},
        month={11},
        keywords={Distributed crawler, Scrapy framework, Data processing},
        doi={10.1007/978-3-030-62483-5_18}
    }
    
  • Jiancai Wang, Jianting Shi: The Crawl and Analysis of Recruitment Data Based on the Distributed Crawler. GREENETS 2020, Springer. DOI: 10.1007/978-3-030-62483-5_18
Jiancai Wang1,*, Jianting Shi2
  • 1: Office of Academic Affairs, Heilongjiang University of Science and Technology
  • 2: School of Computer and Information Engineering, Heilongjiang University of Science and Technology
*Contact email: 154539860@qq.com

Abstract

With the rapid development of the Internet, efficiently and quickly obtaining useful data has become an important problem. In this paper, a distributed crawling system is designed and implemented to capture recruitment data from online recruitment websites. The design combines the architecture and operation workflow of the Scrapy crawler framework in Python with the composition and functions of Scrapy-Redis and the concept of data visualization: Echarts is applied to the crawled results to describe the characteristics of the web pages where employers publish recruitment information. On the basis of the Scrapy framework, downloader middleware with proxy IPs and dynamic User-Agents is used to prevent the crawler from being blocked by websites, and data cleaning and encoding conversion are applied during data processing.
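The anti-blocking measures the abstract mentions (downloader middleware, proxy IPs, dynamic User-Agents) follow a common Scrapy pattern. The sketch below is a minimal, framework-free illustration of that idea, not the authors' code: the class name, User-Agent strings, and proxy URLs are all assumptions, and a plain dict stands in for a Scrapy Request object.

```python
import random

# Illustrative pools only; a real deployment would load these from
# configuration or a shared Redis store (e.g. via Scrapy-Redis).
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (X11; Linux x86_64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]
PROXIES = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
]

class RotatingAccessMiddleware:
    """Sketch of a downloader middleware that rotates the User-Agent
    header and the outbound proxy on every request.  In actual Scrapy
    this class would be registered in DOWNLOADER_MIDDLEWARES and
    receive scrapy.Request objects; here a dict stands in."""

    def process_request(self, request, spider=None):
        # Pick a fresh identity per request so the target site cannot
        # block the crawler by UA string or source IP alone.
        request["headers"]["User-Agent"] = random.choice(USER_AGENTS)
        request["meta"]["proxy"] = random.choice(PROXIES)
        return None  # returning None lets the framework continue
```

In Scrapy itself, `process_request` returning `None` tells the engine to keep processing the (now mutated) request through the remaining middlewares and the downloader.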

Keywords
Distributed crawler, Scrapy framework, Data processing
Published
2020-11-03
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-030-62483-5_18
Copyright © 2020–2025 ICST
