Author: Fathy, Ahmed; Li, Kan
Title: TemporalGAT: Attention-Based Dynamic Graph Representation Learning
Cord-id: b0l8gzke
Document date: 2020-04-17
ID: b0l8gzke
Document: Learning representations for dynamic graphs is fundamental, as it supports numerous graph analytic tasks such as dynamic link prediction, node classification, and visualization. Real-world dynamic graphs evolve continuously: new nodes and edges are introduced or removed as the graph changes. Because of the complexity of modeling dynamic graphs, most existing dynamic graph representation learning methods assume a fixed node set and therefore cannot efficiently learn the evolutionary patterns of real-world evolving graphs. Moreover, existing methods generally model the structural information of evolving graphs separately from the temporal information; this loses important structural and temporal signal and degrades the model's predictive performance. By employing a neural network architecture based on graph attention networks and temporal convolutions, our framework jointly learns graph representations that capture both the evolving graph structure and its temporal patterns. We propose a deep attention model that learns low-dimensional feature representations preserving graph structure and features across a series of graph snapshots over time. Experimental results on multiple real-world dynamic graph datasets show that our proposed method is competitive with various state-of-the-art methods.
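The abstract describes combining per-snapshot graph attention with temporal convolutions across a series of snapshots. A minimal NumPy sketch of that combination is given below; the function names, layer sizes, activation choices (LeakyReLU for attention logits, tanh elsewhere), and the causal convolution are illustrative assumptions, not the paper's exact TemporalGAT architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_attention_layer(X, A, W, a):
    """Single-head graph attention layer on one snapshot.

    X: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, Fp) projection; a: (2*Fp,) attention vector.
    """
    H = X @ W                                     # projected features, (N, Fp)
    N = H.shape[0]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            # attention logit e_ij = LeakyReLU(a^T [h_i || h_j])
            z = np.concatenate([H[i], H[j]]) @ a
            e[i, j] = z if z > 0 else 0.2 * z
    # mask non-edges, then softmax over each node's neighborhood
    e = np.where(A > 0, e, -1e9)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return np.tanh(alpha @ H)                     # (N, Fp)

def temporal_conv(H_seq, K):
    """Causal 1-D convolution over the time axis of snapshot embeddings.

    H_seq: (T, N, Fp) stacked per-snapshot embeddings;
    K: (k, Fp, Fp) temporal kernel of width k.
    """
    T = H_seq.shape[0]
    out = np.zeros_like(H_seq)
    for t in range(T):
        for tau in range(K.shape[0]):
            if t - tau >= 0:                      # only look backward in time
                out[t] += H_seq[t - tau] @ K[tau]
    return np.tanh(out)
```

Structural and temporal information are learned jointly in the sense that the temporal convolution operates directly on attention-derived embeddings, so gradients (in a trained version) would flow through both stages together.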