
[ECAI 2023] Adversarial Erasing with Pruned Elements: Towards Better Graph Lottery Tickets


Official codebase for the paper Adversarial Erasing with Pruned Elements: Towards Better Graph Lottery Tickets. This codebase builds on the open-source DGL framework; please refer to that repository for further documentation.

Overview

Abstract: Graph Lottery Ticket (GLT), a combination of a core subgraph and a sparse subnetwork, has been proposed to mitigate the computational cost of deep Graph Neural Networks (GNNs) on large input graphs while preserving the original performance. However, the winning GLTs in existing studies are obtained by applying iterative magnitude-based pruning (IMP) without re-evaluating or re-considering the pruned information, which disregards the dynamic changes in the significance of edges/weights during graph/model structure pruning and thus limits the appeal of the winning tickets. In this paper, we formulate a conjecture: there exists overlooked valuable information in the pruned graph connections and model parameters that can be re-grouped into the GLT to enhance its final performance. Specifically, we propose an adversarial complementary erasing (ACE) framework to explore the valuable information in the pruned components, thereby developing a more powerful GLT, referred to as ACE-GLT. The main idea is to mine valuable information from pruned edges/weights after each round of IMP and to employ the ACE technique to refine the GLT process. Finally, experimental results demonstrate that our ACE-GLT outperforms existing methods for searching GLTs across diverse tasks.
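
For intuition, here is a minimal sketch of the magnitude-based masking step that underlies each IMP round. This is an illustration only, not the authors' implementation: the tensor names are made up, and the ACE step that re-mines the pruned elements is deliberately omitted.

import torch

def magnitude_mask(scores, prune_ratio):
    """Return a {0,1} mask that keeps the top (1 - prune_ratio) entries by magnitude."""
    k = int(scores.numel() * prune_ratio)      # number of entries to prune this round
    if k == 0:
        return torch.ones_like(scores)
    threshold = scores.abs().flatten().kthvalue(k).values
    return (scores.abs() > threshold).float()  # 1 = kept, 0 = pruned

# One round with the default rates used below: 20% of weights, 5% of adjacency entries.
weights = torch.randn(16, 16)                  # hypothetical layer weights
adj = torch.rand(16, 16)                       # hypothetical (dense) adjacency scores
wei_mask = magnitude_mask(weights, 0.2)
adj_mask = magnitude_mask(adj, 0.05)
erased_weights = weights * (1 - wei_mask)      # the pruned elements that ACE re-mines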

Prerequisites

Install dependencies

See the requirement.txt file for the full list of dependencies.
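
Assuming a standard pip workflow, the dependencies can typically be installed with:

# install dependencies (assumes pip; adapt for conda if preferred)
pip install -r requirement.txt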

Usage

Please follow the instructions below to replicate the results in the paper.

Small-scale datasets

# baseline (non-pruned cases)
python small_scale/baseline.py --backbone <BACKBONE> --dataset <DATASET>

# ACE-GLT (GCN)
python small_scale/glt_gcn.py --dataset <DATASET> --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 200 --fix_epochs 200 --s1 1e-2 --s2 1e-2 --init_soft_mask_type all_one

# ACE-GLT (GIN)
python small_scale/glt_gin.py --dataset <DATASET> --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 200 --fix_epochs 200 --s1 1e-3 --s2 1e-3 --init_soft_mask_type all_one

# ACE-GLT (GAT)
python small_scale/glt_gat.py --dataset <DATASET> --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 200 --fix_epochs 200 --s1 1e-3 --s2 1e-3 --init_soft_mask_type all_one

Parameters:

  • --backbone : 'gcn', 'gin', or 'gat'
  • --dataset : 'cora', 'citeseer', or 'pubmed'
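
For example, to search for an ACE-GLT with the GCN backbone on Cora (hyperparameters as listed above):

# example: ACE-GLT (GCN) on Cora
python small_scale/glt_gcn.py --dataset cora --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 200 --fix_epochs 200 --s1 1e-2 --s2 1e-2 --init_soft_mask_type all_one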

Large-scale datasets

# baseline (OGBN-arxiv)
python large_scale/ogbn_arxiv/baseline.py

# baseline (OGBN-proteins)
python large_scale/ogbn_proteins/baseline.py

# ACE-GLT (OGBN-arxiv)
python large_scale/ogbn_arxiv/glt_resgcn.py --use_gpu --self_loop --learn_t --num_layers 28 --block res+ --gcn_aggr softmax_sg --t 0.1 --s1 1e-6 --s2 1e-4 --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 250 --fix_epochs 500 --model_save_path IMP

# ACE-GLT (OGBN-proteins)
python large_scale/ogbn_proteins/glt_resgcn.py --use_gpu --conv_encode_edge --use_one_hot_encoding --learn_t --num_layers 28 --s1 1e-1 --s2 1e-3 --pruning_percent_wei 0.2 --pruning_percent_adj 0.05 --mask_epochs 250 --fix_epochs 500 --model_save_path IMP

Citation

If you find this work useful for your research, please cite our paper:

@inproceedings{wang2023ACEGLT,
      title={Adversarial Erasing with Pruned Elements: Towards Better Graph Lottery Ticket}, 
      author={Yuwen Wang and Shunyu Liu and Kaixuan Chen and Tongtian Zhu and Ji Qiao and Mengjie Shi and Yuanyu Wan and Mingli Song},
      booktitle={European Conference on Artificial Intelligence},
      year={2023}
}

Contact

Please feel free to contact me via email ([email protected]) if you have any questions about our work.
