This is a PyTorch implementation of the DiscoGNN paper (see the citation at the end of this README).
We used the following Python packages for core development; the code was tested with Python 3.7. A sketch of one possible way to install them is given after the list.
pytorch 1.0.1
torch-cluster 1.2.4
torch-geometric 1.0.3
torch-scatter 1.1.2
torch-sparse 0.2.4
torch-spline-conv 1.0.6
rdkit 2019.03.1.0
tqdm 4.31.1
tensorboardx 1.6
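One possible way to install these with conda/pip is sketched below; the torch-scatter, torch-sparse, torch-cluster, and torch-spline-conv wheels must match your PyTorch and CUDA build, so treat this as a starting point rather than a verified install script:

conda install -c rdkit rdkit=2019.03.1
pip install torch==1.0.1
pip install torch-scatter==1.1.2 torch-sparse==0.2.4 torch-cluster==1.2.4 torch-spline-conv==1.0.6
pip install torch-geometric==1.0.3
pip install tqdm==4.31.1 tensorboardx==1.6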
All the necessary data files can be downloaded from the links below. For the chemistry dataset, download the chem data archive (2.5GB), unzip it, and put it under dataset/.
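For example, assuming the downloaded archive is named chem_dataset.zip (the actual filename and internal layout may differ), the steps are roughly:

unzip chem_dataset.zip
# make sure the extracted folders end up under dataset/ in the repository root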
python pretrain_disco.py --output_model_file OUTPUT_MODEL_PATH
This will save the resulting pre-trained model to OUTPUT_MODEL_PATH.
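For example, to write the pre-trained weights into a local saved_models/ directory (the path is purely illustrative):

python pretrain_disco.py --output_model_file saved_models/disco_pretrained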
python finetune.py --input_model_file INPUT_MODEL_PATH --dataset DOWNSTREAM_DATASET --filename OUTPUT_FILE_PATH
This will fine-tune the pre-trained model specified by INPUT_MODEL_PATH on the downstream dataset DOWNSTREAM_DATASET. The fine-tuning results will be saved to OUTPUT_FILE_PATH.
Our results in the paper can be reproduced with scaffold splitting using random seeds from 0 to 9, as sketched below.
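A minimal sketch of sweeping those seeds for one downstream dataset is shown below. The dataset name (bbbp) and the seed flag (--runseed) are assumptions, so check the argument parser in finetune.py for the exact names used in this codebase:

for seed in $(seq 0 9); do
  python finetune.py --input_model_file INPUT_MODEL_PATH --dataset bbbp \
    --runseed $seed --filename results/bbbp_seed${seed}
done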
[1] Strategies for Pre-training Graph Neural Networks (Hu et al., ICLR 2020)
@inproceedings{xia2024discognn,
title={DiscoGNN: A Sample-Efficient Framework for Self-Supervised Graph Representation Learning},
author={Xia, Jun and Chen, Shaorong and Liu, Yue and Gao, Zhangyang and Zheng, Jiangbin and Yang, Xihong and Li, Stan Z},
booktitle={2024 IEEE 40th International Conference on Data Engineering (ICDE)},
pages={2876--2888},
year={2024},
organization={IEEE}
}