Graph-structured data (GSD) is ubiquitous in real-life applications, appearing in many learning tasks such as property prediction for molecular graphs, product recommendation from heterogeneous information networks, and logical query answering over knowledge graphs. Recently, learning from graph-structured data has also become a research focus in the machine learning community. However, due to this very diversity, there is no universal learning model that performs well and consistently across different graph-based learning applications. This stands in sharp contrast to other domains: convolutional neural networks work well on natural images, and transformers are good choices for text data. In this tutorial, we will discuss using automated machine learning (AutoML) as a tool to design learning models for GSD. Specifically, we will elaborate on what AutoML is, what kinds of prior information from graphs can be explored by AutoML, and how insights can be generated from the searched models.
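The search-based view of AutoML sketched above (designing models automatically rather than by hand) boils down to three ingredients: a search space of design choices, a search strategy, and an evaluation function. A minimal toy sketch, using grid search over GNN-style design choices; the search space and scoring function here are illustrative assumptions, not the methods covered in the tutorial:

```python
import itertools

# Illustrative search space of GNN design choices (assumed for
# this sketch; a real system would use a richer space).
SEARCH_SPACE = {
    "aggregator": ["mean", "sum", "max"],  # how a layer pools neighbor messages
    "num_layers": [1, 2, 3],               # GNN depth
}

def evaluate(config):
    # Stand-in for training the candidate model and measuring
    # validation accuracy; a real AutoML system would train a
    # GNN here and return its validation score.
    score = 0.1 * config["num_layers"]
    if config["aggregator"] == "mean":
        score += 0.5
    return score

def grid_search(space):
    # Exhaustively enumerate the (small) search space and keep
    # the best-scoring configuration.
    keys = list(space)
    best_config, best_score = None, float("-inf")
    for values in itertools.product(*(space[k] for k in keys)):
        config = dict(zip(keys, values))
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = grid_search(SEARCH_SPACE)
# Under this toy objective, the search selects mean aggregation
# with 3 layers.
```

Grid search is only tractable for tiny spaces; the tutorial's topics (e.g., reinforcement-learning-based and differentiable architecture search) replace the exhaustive loop with smarter search strategies while keeping the same space/strategy/evaluation structure.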


Time Event
12:00-12:45 Part 1: An introduction to Automated Machine Learning (AutoML). [Slides]
  Speaker: Quanming Yao
12:45-13:30 Part 2: Automated Graph Neural Network. [Slides]
  Speaker: Huan Zhao
13:30-14:15 Part 3: Automated Knowledge Graph Embedding. [Slides]
  Speaker: Yongqi Zhang
14:15-14:30 Part 4: Discussion


Quanming Yao, EE Department, Tsinghua University, Beijing, China.

Huan Zhao, 4Paradigm Inc., Beijing, China.

Yongqi Zhang, 4Paradigm Inc., Beijing, China.

Past Tutorial


Due to space limitations, we only list highly related papers.

  1. F. Hutter, L. Kotthoff, and J. Vanschoren, "Automated machine learning: methods, systems, challenges", Springer Nature, 2019.
  2. Q. Yao and M. Wang, "Taking human out of learning applications: A survey on automated machine learning", arXiv preprint, 2018.
  3. T. N. Kipf and M. Welling, "Semi-supervised classification with graph convolutional networks", in ICLR, 2017.
  4. Y. Gao, H. Yang, P. Zhang, C. Zhou, and Y. Hu, "GraphNAS: Graph neural architecture search with reinforcement learning", in IJCAI, 2020.
  5. J. You, R. Ying, and J. Leskovec, "Design Space for Graph Neural Networks", in NeurIPS, 2020.
  6. H. Zhao, L. Wei, and Q. Yao, "Simplifying Architecture Search for Graph Neural Network", in CIKM-CSSA, 2020.
  7. H. Zhao, Q. Yao, and W. Tu, "Search to aggregate neighborhood for graph neural network", in ICDE, 2021.
  8. Z. Zhang, X. Wang, and W. Zhu, "Automated Machine Learning on Graphs: A Survey", in IJCAI, 2021.
  9. S. Ji, S. Pan, E. Cambria, P. Marttinen, and P. S. Yu, "A Survey on Knowledge Graphs: Representation, Acquisition and Applications", in TNNLS, 2021.
  10. Z. Sun, Z. Deng, J. Nie, and J. Tang, "RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space", in ICLR, 2019.
  11. X. Wang, X. He, Y. Cao, M. Liu, and T. Chua, "KGAT: Knowledge Graph Attention Network for Recommendation", in KDD, 2019.
  12. Y. Zhang, Q. Yao, W. Dai, and L. Chen, "AutoSF: Searching Scoring Functions for Knowledge Graph Embedding", in ICDE, 2020.
  13. Y. Zhang, Q. Yao, and L. Chen, "Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding", in NeurIPS, 2020.