
GFMPapers: Must-read papers on graph foundation models (GFMs)


This list is currently maintained by members of BUPT GAMMA Lab. If you find this project useful, please give us a star ⭐ on GitHub to stay up to date.

Many thanks to all of our great contributors.

⭐ [News] We will hold a tutorial on graph foundation models at The Web Conference (WWW) 2024. See you in Singapore!

Contents

  0. Survey Papers
  1. GNN-based Papers
  2. LLM-based Papers
  3. GNN+LLM-based Papers

Keywords Convention

Each paper is tagged along three dimensions: backbone architecture, pretraining, and adaptation. The meaning of each tag is explained in the survey "Towards Graph Foundation Models: A Survey and Beyond".

0. Survey Papers

  1. [arXiv 2023.08] Graph Meets LLMs: Towards Large Graph Models. [pdf] [paperlist]
  2. [arXiv 2023.10] Integrating Graphs with Large Language Models: Methods and Prospects. [pdf]
  3. [arXiv 2023.10] Towards Graph Foundation Models: A Survey and Beyond. [pdf] [paperlist]
  4. [arXiv 2023.11] A Survey of Graph Meets Large Language Model: Progress and Future Directions. [pdf] [paperlist]
  5. [arXiv 2023.12] Large Language Models on Graphs: A Comprehensive Survey. [pdf] [paperlist]

1. GNN-based Papers

  1. [arXiv 2023.10] Enhancing Graph Neural Networks with Structure-Based Prompt. [pdf]
  2. [arXiv 2023.11] MultiGPrompt for Multi-Task Pre-Training and Prompting on Graphs. [pdf]
  3. [arXiv 2023.10] HetGPT: Harnessing the Power of Prompt Tuning in Pre-Trained Heterogeneous Graph Neural Networks. [pdf]
  4. [arXiv 2023.10] Prompt Tuning for Multi-View Graph Contrastive Learning. [pdf]
  5. [arXiv 2023.05] PRODIGY: Enabling In-Context Learning Over Graphs. [pdf]
  6. [arXiv 2023.05] G-Adapter: Towards Structure-Aware Parameter-Efficient Transfer Learning for Graph Transformer Networks. [pdf]
  7. [arXiv 2023.04] AdapterGNN: Efficient Delta Tuning Improves Generalization Ability in Graph Neural Networks. [pdf]
  8. [arXiv 2023.02] SGL-PT: A Strong Graph Learner with Graph Prompt Tuning. [pdf]
  9. [KDD 2023] All in One: Multi-Task Prompting for Graph Neural Networks. [pdf]
  10. [KDD 2023] A Data-Centric Framework to Endow Graph Neural Networks with Out-of-Distribution Detection Ability. [pdf] [code]
  11. [AAAI 2023] MA-GCL: Model Augmentation Tricks for Graph Contrastive Learning. [pdf] [code]
  12. [WWW 2023] GraphMAE2: A Decoding-Enhanced Masked Self-Supervised Graph Learner. [pdf] [code]
  13. [WWW 2023] GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks. [pdf] [code]
  14. [CIKM 2023] Voucher Abuse Detection with Prompt-Based Fine-Tuning on Graph Neural Networks. [pdf] [code]
  15. [KDD 2022] GraphMAE: Self-Supervised Masked Graph Autoencoders. [pdf] [code]
  16. [KDD 2022] GPPT: Graph Pre-Training and Prompt Tuning to Generalize Graph Neural Networks.
  17. [arXiv 2022.09] Universal Prompt Tuning for Graph Neural Networks. [pdf]
  18. [KDD 2021] Pre-Training on Large-Scale Heterogeneous Graph. [pdf] [code]
  19. [CIKM 2021] Contrastive Pre-Training of GNNs on Heterogeneous Graphs. [pdf] [code]
  20. [ICML 2020] Deep Graph Contrastive Representation Learning. [pdf] [code]
  21. [NeurIPS 2020] Self-Supervised Graph Transformer on Large-Scale Molecular Data. [pdf]
  22. [NeurIPS 2020] Graph Contrastive Learning with Augmentations. [pdf] [code]
  23. [KDD 2020] GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training. [pdf] [code]
  24. [KDD 2020] GPT-GNN: Generative Pre-Training of Graph Neural Networks. [pdf] [code]
  25. [arXiv 2020.01] Graph-BERT: Only Attention is Needed for Learning Graph Representations. [pdf] [code]
  26. [ICLR 2019] Deep Graph Infomax. [pdf] [code]
  27. [arXiv 2016.11] Variational Graph Auto-Encoders. [pdf] [code]

2. LLM-based Papers

  1. [arXiv 2023.10] Talk Like a Graph: Encoding Graphs for Large Language Models. [pdf]
  2. [arXiv 2023.10] GraphText: Graph Reasoning in Text Space. [pdf]
  3. [arXiv 2023.09] Can LLMs Effectively Leverage Graph Structural Information: When and Why. [pdf]
  4. [arXiv 2023.08] Natural Language is All a Graph Needs. [pdf] [code]
  5. [arXiv 2023.08] Evaluating Large Language Models on Graphs: Performance Insights and Comparative Analysis. [pdf] [code]
  6. [arXiv 2023.07] Can Large Language Models Empower Molecular Property Prediction? [pdf] [code]
  7. [arXiv 2023.07] Meta-Transformer: A Unified Framework for Multimodal Learning. [pdf] [code]
  8. [arXiv 2023.07] Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs. [pdf] [code]
  9. [arXiv 2023.05] GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning. [pdf]
  10. [arXiv 2023.05] Can Language Models Solve Graph Problems in Natural Language? [pdf] [code]
  11. [arXiv 2023.05] GPT4Graph: Can Large Language Models Understand Graph Structured Data? An Empirical Evaluation and Benchmarking. [pdf]

3. GNN+LLM-based Papers

  1. [arXiv 2023.10] Label-Free Node Classification on Graphs with Large Language Models (LLMs). [pdf]
  2. [arXiv 2023.09] One for All: Towards Training One Graph Model for All Classification Tasks. [pdf]
  3. [arXiv 2023.09] Prompt-Based Node Feature Extractor for Few-Shot Learning on Text-Attributed Graphs. [pdf]
  4. [arXiv 2023.08] SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning. [pdf]
  5. [arXiv 2023.05] Explanations as Features: LLM-Based Features for Text-Attributed Graphs. [pdf]
  6. [arXiv 2023.05] ConGraT: Self-Supervised Contrastive Pretraining for Joint Graph and Text Embeddings. [pdf]
  7. [arXiv 2023.04] Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs. [pdf]
  8. [arXiv 2023.04] Graph-ToolFormer: To Empower LLMs with Graph Reasoning Ability via Prompt Augmented by ChatGPT. [pdf]
  9. [ICLR 2023] Learning on Large-Scale Text-Attributed Graphs via Variational Inference. [pdf]
  10. [SIGIR 2023] Augmenting Low-Resource Text Classification with Graph-Grounded Pre-Training and Prompting. [pdf]
  11. [PMLR 2023] Enhancing Activity Prediction Models in Drug Discovery with the Ability to Understand Human Language. [pdf]
  12. [ICLR 2022] Node Feature Extraction by Self-Supervised Multi-Scale Neighborhood Prediction. [pdf]
  13. [arXiv 2022.12] Multi-Modal Molecule Structure-Text Model for Text-Based Retrieval and Editing. [pdf]
  14. [arXiv 2022.09] A Molecular Multimodal Foundation Model Associating Molecule Graphs with Natural Language. [pdf]
  15. [NeurIPS 2021] GraphFormers: GNN-Nested Transformers for Representation Learning on Textual Graph. [pdf]
  16. [EMNLP 2021] Text2Mol: Cross-Modal Molecule Retrieval with Natural Language Queries. [pdf]
  17. [arXiv 2020.08] Graph-Based Modeling of Online Communities for Fake News Detection. [pdf]

Contributors

We thank all the contributors to this list; further contributions are very welcome.
