Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III

Research Article

An Evolving Transformer Network Based on Hybrid Dilated Convolution for Traffic Flow Prediction

Cite
BibTeX Plain Text
  • @INPROCEEDINGS{10.1007/978-3-031-54531-3_18,
        author={Qi Yu and Weilong Ding and Maoxiang Sun and Jihai Huang},
        title={An Evolving Transformer Network Based on Hybrid Dilated Convolution for Traffic Flow Prediction},
        proceedings={Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III},
        proceedings_a={COLLABORATECOM PART 3},
        year={2024},
        month={2},
        keywords={Traffic flow prediction; Transformer; Hybrid dilated convolution; Time series; Attention mechanism},
        doi={10.1007/978-3-031-54531-3_18}
    }
    
  • Qi Yu
    Weilong Ding
    Maoxiang Sun
    Jihai Huang
    Year: 2024
    An Evolving Transformer Network Based on Hybrid Dilated Convolution for Traffic Flow Prediction
    COLLABORATECOM PART 3
    Springer
    DOI: 10.1007/978-3-031-54531-3_18
Qi Yu1, Weilong Ding1,*, Maoxiang Sun1, Jihai Huang2
  • 1: School of Information Science and Technology
  • 2: Zhengzhou University of Technology
*Contact email: dingweilong@ncut.edu.cn

Abstract

Decision making based on predicted traffic flow is an effective way to relieve road congestion. Capturing and modeling the dynamic temporal relationships in global data is a key part of the traffic flow prediction problem. The Transformer network has proven powerful at capturing long-range dependencies and interactions in sequences, making it widely used in traffic flow prediction tasks. However, existing Transformer-based models still have limitations. On the one hand, they ignore the dynamism and local relevance of traffic flow time series because the input data are embedded statically. On the other hand, they do not account for the inheritance of attention patterns, since the attention scores of each layer are learned separately. To address these two issues, we propose an evolving Transformer network based on hybrid dilated convolution, named HDCformer. First, a novel sequence embedding layer based on dilated convolution dynamically learns the local relevance of traffic flow time series. Second, we add residual connections between the attention modules of adjacent layers to fully capture the evolution of attention patterns across layers. HDCformer is evaluated on two real-world datasets, and the results show that it outperforms state-of-the-art baselines in terms of MAE, RMSE, and MAPE.

Keywords
Traffic flow prediction, Transformer, Hybrid dilated convolution, Time series, Attention mechanism
Published
2024-02-23
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-54531-3_18
Copyright © 2023–2025 ICST
