
Research Article
GFTLSTM: Dynamic Graph Neural Network Model Based on Graph Framelets Transform
@INPROCEEDINGS{10.1007/978-3-031-55722-4_6,
  author={Shengpeng Yang and Siwei Zhou and Shasha Yang and Jiandong Shi},
  title={GFTLSTM: Dynamic Graph Neural Network Model Based on Graph Framelets Transform},
  proceedings={Intelligent Technologies for Interactive Entertainment. 14th EAI International Conference, INTETAIN 2023, Lucca, Italy, November 27, 2023, Proceedings},
  proceedings_a={INTETAIN},
  year={2024},
  month={3},
  keywords={graph representation learning; dynamic graph; graph framelets transform},
  doi={10.1007/978-3-031-55722-4_6}
}
Shengpeng Yang
Siwei Zhou
Shasha Yang
Jiandong Shi
Year: 2024
GFTLSTM: Dynamic Graph Neural Network Model Based on Graph Framelets Transform
INTETAIN
Springer
DOI: 10.1007/978-3-031-55722-4_6
Abstract
There is currently a surge of interest in graph representation learning, with researchers increasingly focusing on methods and applications involving graph neural networks (GNNs). However, traditional GNN models for static graph data analysis are limited in their ability to extract evolutionary patterns from dynamic graphs, which are more common in real-world data. Additionally, existing dynamic GNN models tend to prioritize low-frequency information while neglecting high-frequency information. To address these limitations, we introduce a dynamic graph neural network model that leverages graph framelet transforms, capitalizing on the strengths of traditional wavelets in multi-resolution analysis. We first construct the graph framelet transform building on research into graph wavelets, and then implement multi-resolution graph convolution with both low-pass and high-pass filtering. We then incorporate this convolution operation into a long short-term memory (LSTM) network, yielding a dynamic GNN model founded on the graph framelet transform that effectively uncovers the evolutionary information embedded within dynamic graphs. In our experimental evaluation, we compare our model with 11 widely used dynamic graph representation learning algorithms on discrete dynamic graph representation learning tasks across three public datasets (six groups of experimental data in total). Our model outperforms the alternatives in terms of accuracy on the majority of the datasets.
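To make the architecture described in the abstract more concrete, the following is a minimal, illustrative PyTorch sketch and not the authors' implementation: it approximates a two-channel (low-pass/high-pass) framelet-style graph convolution on each graph snapshot and feeds the resulting node embeddings into an LSTM over time. The class names (FrameletConv, GFTLSTMSketch), the simple filter approximations built from the normalized Laplacian, and all hyperparameters are assumptions for illustration; the paper's framelet transform uses multi-scale filter banks rather than these single-scale filters.

```python
# Illustrative sketch only: simplified framelet-style graph convolution + LSTM
# over a sequence of graph snapshots. Not the GFTLSTM reference implementation.
import torch
import torch.nn as nn


def normalized_laplacian(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.where(deg > 0, deg.pow(-0.5), torch.zeros_like(deg))
    d_mat = torch.diag(d_inv_sqrt)
    return torch.eye(adj.size(0)) - d_mat @ adj @ d_mat


class FrameletConv(nn.Module):
    """Two-channel spectral-style convolution (assumed filters).

    The low-pass filter is crudely approximated by (I - L/2) and the
    high-pass filter by (L/2); a real framelet transform would use
    Chebyshev-approximated scaling functions at multiple scales.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.w_low = nn.Linear(in_dim, out_dim, bias=False)
        self.w_high = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, lap: torch.Tensor) -> torch.Tensor:
        low = (torch.eye(lap.size(0)) - 0.5 * lap) @ x   # smooth component
        high = (0.5 * lap) @ x                           # detail component
        return torch.relu(self.w_low(low) + self.w_high(high))


class GFTLSTMSketch(nn.Module):
    """Per-snapshot framelet-style convolution followed by an LSTM over time."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.conv = FrameletConv(in_dim, hid_dim)
        self.lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True)

    def forward(self, snapshots):
        # snapshots: list of (adjacency [N, N], features [N, in_dim]) per step
        per_step = [self.conv(x, normalized_laplacian(a)) for a, x in snapshots]
        seq = torch.stack(per_step, dim=1)   # [N, T, hid_dim], nodes as batch
        out, _ = self.lstm(seq)              # node-wise temporal encoding
        return out[:, -1]                    # embeddings after the last snapshot


if __name__ == "__main__":
    n_nodes, feat_dim, n_steps = 5, 8, 3
    snaps = [(torch.randint(0, 2, (n_nodes, n_nodes)).float(),
              torch.randn(n_nodes, feat_dim)) for _ in range(n_steps)]
    model = GFTLSTMSketch(feat_dim, 16)
    print(model(snaps).shape)  # torch.Size([5, 16])
```

In this sketch the convolution mixes the low-pass and high-pass channels before the temporal model, which mirrors the abstract's point that both frequency bands contribute to the representation; the final node embeddings could then feed a task head such as a link-prediction or node-classification layer.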