We present Spectral Attention Networks, a powerful GNN that leverages key principles from spectral graph theory to enable full graph attention.
- `nets` contains the Node, Edge and no LPE architectures implemented with PyTorch.
- `layers` contains the multi-headed attention employed by the Main Graph Transformer implemented in DGL.
- `configs` contains the various parameters used in the ablation and SOTA comparison studies.
- `train` contains methods to train the models.
- `docs` contains scripts from https://github.com/graphdeeplearning/graphtransformer to download datasets and set up environments.
- `scripts` contains scripts to reproduce ablation and SOTA comparison results. See `scripts/reproduce.md` for details.
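As a rough illustration of the spectral principle behind the LPE-style architectures above (a minimal NumPy sketch of Laplacian eigenvector positional encodings, not the repository's actual PyTorch/DGL implementation; the function name `laplacian_pe` is ours):

```python
import numpy as np

def laplacian_pe(adj: np.ndarray, k: int) -> np.ndarray:
    """Sketch: use the k smallest non-trivial eigenvectors of the
    symmetric normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}
    as per-node positional encodings."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # eigh returns eigenvalues in ascending order for symmetric matrices
    _, eigvecs = np.linalg.eigh(lap)
    # skip the first (trivial, constant) eigenvector
    return eigvecs[:, 1:k + 1]

# Example: positional encodings for a 4-node cycle graph
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
pe = laplacian_pe(A, 2)
print(pe.shape)  # (4, 2): one 2-dim encoding per node
```

These eigenvectors generalize the sinusoidal position encodings of standard Transformers to arbitrary graphs, which is what allows attention over all node pairs to remain structure-aware.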