This repository provides the implementation of our U2GNN model as described in our paper, where we leverage the transformer self-attention network to construct a powerful aggregation function for learning graph representations.
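To illustrate the core idea, below is a minimal sketch (not the authors' exact code) of transformer-based neighbor aggregation using PyTorch's standard `nn.TransformerEncoder`. The tensor shapes, hyperparameter values, and the `aggregate` helper are illustrative assumptions, not the repository's API:

```python
import torch
import torch.nn as nn

d_model, num_neighbors, num_timesteps = 128, 8, 4

# A standard transformer encoder serves as the aggregation function.
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=8, dim_feedforward=1024)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=1)

def aggregate(node_vec, neighbor_vecs):
    # node_vec: (batch, d_model); neighbor_vecs: (num_neighbors, batch, d_model)
    # Self-attention runs over the sequence [node; sampled neighbors];
    # the output at the node's position is its updated representation.
    seq = torch.cat([node_vec.unsqueeze(0), neighbor_vecs], dim=0)  # (1+K, batch, d_model)
    out = encoder(seq)                                              # (1+K, batch, d_model)
    return out[0]                                                   # (batch, d_model)

x = torch.randn(32, d_model)                    # a batch of 32 node embeddings
nbrs = torch.randn(num_neighbors, 32, d_model)  # sampled neighbor embeddings
for _ in range(num_timesteps):                  # repeat the aggregation for T timesteps
    x = aggregate(x, nbrs)
```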
- 17-05-2020: Updated the PyTorch (1.5.0) implementation.
- Python 3.x
- TensorFlow 1.14
- Tensor2Tensor 1.13
- NetworkX 2.3
- scikit-learn 0.21.2
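The dependencies above can be installed with pip, for example as follows (the pinned versions mirror the list above; `torch==1.5.0` matches the PyTorch update noted earlier, and a plain pip environment is assumed):

U2GNN$ pip install torch==1.5.0 tensorflow==1.14.* tensor2tensor==1.13.* networkx==2.3 scikit-learn==0.21.2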
Example commands for training the supervised U2GNN model:

U2GNN$ python train_U2GNN_Sup.py --dataset IMDBBINARY --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 8 --num_sampled 512 --num_epochs 50 --num_timesteps 4 --learning_rate 0.0005 --model_name IMDBBINARY_bs4_fold1_dro05_1024_8_idx0_4_1
U2GNN$ python train_U2GNN_Sup.py --dataset PTC --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 16 --num_sampled 512 --num_epochs 50 --num_timesteps 3 --learning_rate 0.0005 --model_name PTC_bs4_fold1_dro05_1024_16_idx0_3_1
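To evaluate across all folds (assuming, as the examples above suggest, that --fold_idx selects one fold of a standard 10-fold split and is numbered from 1), a shell loop such as the following can be used:

U2GNN$ for f in $(seq 1 10); do python train_U2GNN_Sup.py --dataset PTC --batch_size 4 --ff_hidden_size 1024 --fold_idx $f --num_neighbors 16 --num_sampled 512 --num_epochs 50 --num_timesteps 3 --learning_rate 0.0005 --model_name PTC_bs4_fold${f}_dro05_1024_16_idx0_3_1; done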
Please cite the paper whenever U2GNN is used to produce published results or incorporated into other software:
@article{Nguyen2019U2GNN,
  author={Dai Quoc Nguyen and Tu Dinh Nguyen and Dinh Phung},
  title={{Universal Graph Transformer Self-Attention Networks}},
  journal={arXiv preprint arXiv:1909.11855},
  year={2019}
}
As a free open-source implementation, U2GNN is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. All other warranties, including but not limited to merchantability and fitness for a particular purpose, whether express, implied, or arising by operation of law, course of dealing, or trade usage, are hereby disclaimed. I believe that the programs compute what I claim they compute, but I do not guarantee this. The programs may be poorly and inconsistently documented and may contain undocumented components, features, or modifications. I make no guarantee that these programs will be suitable for any application.
U2GNN is licensed under the Apache License 2.0.