
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Proficient Graph Neural Network Design by Accumulating Knowledge on Large Language Models

Created by
  • Haebom

Author

Jialiang Wang, Hanmo Liu, Shimin Di, Zhili Wang, Jiachuan Wang, Lei Chen, Xiaofang Zhou

Outline

In this paper, the authors propose DesiGNN, a knowledge-centric framework that addresses the challenge of automatically designing graph neural networks (GNNs) with large language models (LLMs). DesiGNN distills prior model design experience into a structured knowledge dictionary, which the LLM exploits for meta-learning, and performs empirical feature filtering and adaptive information gathering via benchmark analysis and LLM-driven literature review. In this way, it builds meta-knowledge linking the properties of unseen graph data to effective architectural patterns, proposes a top-performing GNN model in a short time, and achieves superior performance at a far lower cost than existing methods.
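To make the knowledge-dictionary idea concrete, here is a minimal, purely illustrative sketch of the retrieval step described above: benchmark graphs are stored with their meta-features and best-known architecture patterns, and an unseen graph retrieves the pattern of its most similar benchmark as the initial design proposal. All names, meta-features, and the distance metric are assumptions for illustration, not the paper's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class KnowledgeEntry:
    """One entry of the (hypothetical) knowledge dictionary:
    a benchmark graph's meta-features and the GNN pattern that worked best."""
    dataset: str
    meta_features: dict  # e.g. average degree, homophily ratio
    best_pattern: dict   # e.g. convolution type, number of layers

# Toy knowledge dictionary built from two benchmark datasets (values illustrative).
KNOWLEDGE_DICT = [
    KnowledgeEntry("Cora",   {"avg_degree": 3.9,  "homophily": 0.81},
                   {"conv": "GCN", "layers": 2}),
    KnowledgeEntry("Flickr", {"avg_degree": 10.1, "homophily": 0.32},
                   {"conv": "GraphSAGE", "layers": 3}),
]

def retrieve_pattern(new_graph: dict) -> dict:
    """Return the architecture pattern of the benchmark graph closest to
    `new_graph` (squared L2 distance over the shared meta-features)."""
    def dist(entry: KnowledgeEntry) -> float:
        return sum((entry.meta_features[k] - new_graph[k]) ** 2
                   for k in new_graph)
    return min(KNOWLEDGE_DICT, key=dist).best_pattern

# A homophilous, sparse citation-style graph retrieves the Cora-like pattern.
proposal = retrieve_pattern({"avg_degree": 4.2, "homophily": 0.78})
```

In the full framework, such a retrieved pattern would only seed the LLM's proposal, which is then refined through the adaptive information-gathering loop; this sketch covers only the similarity-based lookup.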

Takeaways, Limitations

Takeaways:
  • A new approach to automatic GNN design using LLMs
  • DesiGNN enables efficient GNN design with potential performance gains
  • Demonstrates the utility of meta-knowledge-based automated model design
  • Presents a new paradigm for data-centric model design
Limitations:
  • DesiGNN's performance may depend on the choice of LLM and benchmark data.
  • Generalization to diverse types of graph data still needs evaluation.
  • Further research is needed on DesiGNN's scalability and applicability to real-world settings.
  • The interpretability of the designed GNN models may require additional study.