Code repository for the SIGMOD2023 paper "Orca: Scalable Temporal Graph Neural Network Training with Theoretical Guarantees".

Orca: Scalable Temporal Graph Neural Network Training with Theoretical Guarantees

Dataset

Six datasets were used in this paper.

Preprocessing

If edge features or node features are absent, they are replaced by zero vectors. Example usage:

python utils/preprocess_data.py --data wikipedia --bipartite
python utils/preprocess_custom_data.py --data superuser
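
For reference, the zero-vector substitution amounts to something like the following. This is a minimal sketch; the function name and the feature dimension are assumptions for illustration, not the script's exact code:

    import numpy as np

    def features_or_zeros(feats, num_items, feat_dim=172):
        # When no feature file is provided, fall back to an all-zeros
        # matrix so downstream code sees a fixed feature dimensionality.
        # feat_dim=172 is an assumption for illustration only.
        if feats is None:
            return np.zeros((num_items, feat_dim), dtype=np.float32)
        return feats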

Requirements

  • PyTorch 1.7.1
  • Python 3.8
  • Numba 0.54.1
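
Assuming a pip-based setup with Python 3.8, the pinned versions above can be installed along these lines (the exact command is a suggestion, not taken from the repository):

    pip install torch==1.7.1 numba==0.54.1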

Usage

Optional arguments:
    --data                  Dataset name
    --bs                    Batch size
    --n_degree              Number of neighbors to sample
    --n_head                Number of heads used in attention layer
    --n_epoch               Number of epochs
    --n_layer               Number of network layers
    --lr                    Learning rate
    --gpu                   GPU id
    --patience              Patience for early stopping
    --enable_random         Use random seeds
    --gradient              Disable gradient blocking
    --reuse                 Enable caching and reuse (see the sketch after this list)
    --budget                Cache size
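
The --reuse and --budget flags control Orca's central optimization: caching intermediate temporal embeddings and reusing them across batches instead of recomputing them. Below is a minimal sketch of a budget-bounded cache with LRU eviction; the class and method names are hypothetical, and the repository's actual implementation may use a different eviction policy:

    from collections import OrderedDict

    class EmbeddingCache:
        """Hypothetical budget-bounded cache of node embeddings (LRU eviction)."""

        def __init__(self, budget):
            self.budget = budget        # --budget: max number of cached embeddings
            self.store = OrderedDict()  # node_id -> embedding tensor

        def get(self, node_id):
            emb = self.store.get(node_id)
            if emb is not None:
                self.store.move_to_end(node_id)  # mark as recently used
            return emb

        def put(self, node_id, emb):
            # Detach so cached embeddings do not keep the autograd graph
            # alive (gradient blocking; cf. the --gradient flag above).
            self.store[node_id] = emb.detach()
            self.store.move_to_end(node_id)
            if len(self.store) > self.budget:
                self.store.popitem(last=False)  # evict least recently used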

    
Example usage:
    python train.py --n_epoch 50 --n_layer 2 --bs 200 -d wikipedia  --enable_random --reuse --lr 1e-4 --gpu 1
    python train.py --n_epoch 50 --n_layer 2 --bs 200 -d askubuntu  --enable_random --reuse --budget 1000 --lr 1e-7 --gpu 1
