Topics
Note: these topics are preliminary and should not be treated as final until October 24.
Also note that the core references of your assigned topic serve only as a starting point. You are not required to go beyond them, but we encourage you to read and include additional material in your notebooks and talks, in consultation with your supervisor.
Topic 1: GNNs, the Weisfeiler-Lehman test and Graph Isomorphism Networks
Conventional GNNs cannot serve as universal function approximators on the graph domain because they fail to distinguish certain non-isomorphic graphs. This limitation stems from their close relation to the 1-WL test for graph isomorphism. Understanding this relation is foundational to understanding the expressive power of GNNs and the limitations that follow from it.
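To make the connection concrete, below is a minimal, dependency-free Python sketch of 1-WL color refinement; the example graphs, the iteration count, and the shared-palette compression are illustrative choices, not taken from the reference.

    def wl_refine(adjs, iterations=3):
        """adjs: list of graphs (dict node -> list of neighbours). All graphs
        are refined with a shared colour palette so histograms are comparable."""
        colorings = [{v: 0 for v in adj} for adj in adjs]  # uniform start
        for _ in range(iterations):
            palette, new = {}, []
            for adj, colors in zip(adjs, colorings):
                # signature = own colour + multiset of neighbour colours
                sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                       for v in adj}
                # compress each distinct signature into a fresh integer colour
                new.append({v: palette.setdefault(sig[v], len(palette)) for v in adj})
            colorings = new
        return [sorted(c.values()) for c in colorings]

    # Two non-isomorphic 2-regular graphs: two triangles vs. a single 6-cycle.
    two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                     3: [4, 5], 4: [3, 5], 5: [3, 4]}
    six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}

    h1, h2 = wl_refine([two_triangles, six_cycle])
    print(h1 == h2)  # True: identical histograms, yet the graphs differ

Diverging color histograms certify non-isomorphism, but identical histograms (as here) are inconclusive, which is exactly the expressiveness ceiling that GNNs inherit.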
Core references:
Xu, Keyulu, et al. "How powerful are graph neural networks?" (2019)
Supervisor:
Topic 2: Common bottlenecks of GNNs and how to overcome them
Akin to a well-known pitfall of conventional recurrent NNs, GNNs suffer from a bottleneck that hinders the propagation of information between distant nodes. Understanding this over-squashing bottleneck and devising remedies can substantially improve the performance of GNNs on long-range interaction problems.
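As a back-of-the-envelope illustration of where the squashing comes from (a toy calculation, not taken from any particular reference): an L-layer message-passing GNN must compress information from every node within L hops into a single fixed-size vector, and on many graphs that receptive field grows exponentially with L.

    from collections import deque

    def receptive_field_size(adj, root, num_layers):
        """Number of nodes within num_layers hops of root, i.e. how many
        nodes' features an L-layer GNN squashes into root's embedding."""
        seen, frontier = {root}, deque([(root, 0)])
        while frontier:
            v, d = frontier.popleft()
            if d == num_layers:
                continue
            for u in adj[v]:
                if u not in seen:
                    seen.add(u)
                    frontier.append((u, d + 1))
        return len(seen)

    # complete binary tree of depth 10
    adj = {0: []}
    for v in range(1, 2 ** 11 - 1):
        parent = (v - 1) // 2
        adj.setdefault(v, []).append(parent)
        adj[parent].append(v)

    for L in (2, 4, 6, 8, 10):
        print(L, receptive_field_size(adj, 0, L))  # grows exponentially in L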
Core references:
Supervisor:
Topic 3: Graph transformers and graph positional encodings
Self-attention in transformer neural networks and message passing in GNNs are closely related mechanisms for propagating information among tokens or nodes, respectively. Graph transformers combine advances from the transformer and GNN communities, giving rise to a novel, powerful class of models for graph-structured data. Graph positional encodings offer a simple way of injecting additional information about the graph structure into node embeddings; they are essential for graph transformers, but can also be used on their own in combination with conventional GNNs.
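As one concrete example, the sketch below computes Laplacian-eigenvector positional encodings, one of the standard PE schemes used e.g. in GraphGPS; the example graph, the choice of k, and the handling of eigenvector sign ambiguity are simplifications.

    import numpy as np

    def laplacian_pe(A, k):
        """Node positional encodings from the k lowest-frequency non-trivial
        eigenvectors of the symmetric-normalised graph Laplacian (real
        implementations additionally handle the eigenvectors' sign ambiguity,
        e.g. by random sign flips during training; ignored here)."""
        deg = A.sum(axis=1)
        d_inv_sqrt = np.zeros_like(deg)
        d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
        L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
        eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
        return eigvecs[:, 1:k + 1]             # drop the trivial constant one

    # 6-cycle: each node receives a 2-d coordinate reflecting its ring position
    A = np.zeros((6, 6))
    for i in range(6):
        A[i, (i + 1) % 6] = A[(i + 1) % 6, i] = 1.0
    print(laplacian_pe(A, 2).round(3))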
Core references:
Rampášek, Ladislav, et al. "Recipe for a general, powerful, scalable graph transformer." (2022)
Supervisor:
Topic 4: Geometric GNNs and their application to molecular property prediction
A natural extension of representing e.g. molecules as graphs is to embed them in Euclidean space by assigning appropriate 3D coordinates to each atom/node. Geometric GNNs adapt the conventional message-passing framework to this type of geometric graph. Empirically, this kind of molecular representation and graph neural network can significantly improve performance on many tasks in molecular machine learning, such as property prediction.
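Below is a minimal sketch of one message-passing layer on such a geometric graph that uses only pairwise distances and is therefore E(3)-invariant; the radial weighting and the random weight matrices are placeholder assumptions, not the (equivariant) architecture of the cited paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def invariant_mp_layer(h, pos, W_msg, W_upd, cutoff=5.0):
        """One message-passing step on a 3-d point cloud that depends on the
        coordinates only through inter-atomic distances, so the output is
        invariant to rotations and translations of the input."""
        diff = pos[:, None, :] - pos[None, :, :]
        dist = np.linalg.norm(diff, axis=-1)                      # (n, n)
        # smooth radial weight: zero beyond the cutoff and on the diagonal
        w = np.where((dist < cutoff) & (dist > 0),
                     0.5 * (np.cos(np.pi * dist / cutoff) + 1.0), 0.0)
        messages = w @ (h @ W_msg)             # distance-weighted aggregation
        return np.tanh(h @ W_upd + messages)

    # toy "molecule": 4 atoms with 8-d features and random 3-d coordinates
    h = rng.normal(size=(4, 8))
    pos = rng.normal(size=(4, 3))
    W_msg, W_upd = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
    out = invariant_mp_layer(h, pos, W_msg, W_upd)

    # invariance check: rotating the coordinates leaves the output unchanged
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))                  # orthogonal
    print(np.allclose(out, invariant_mp_layer(h, pos @ Q.T, W_msg, W_upd)))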
Core references:
Cremer, Julian, et al. "Equivariant graph neural networks for toxicity prediction." (2023)
Supervisor:
Topic 5: Hierarchical GNNs
Hierarchical methods learn to operate on a multi-level representation of the graph, rather than just a flat node-to-node message-passing scheme. This approach helps the GNN capture richer structural information, such as long-range dependencies and hierarchical patterns that are difficult to capture using traditional GNNs. DiffPool is a general framework for learning hierarchical representations, whereas "Molecular Hypergraph Neural Networks" applies this concept to molecular ML.
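The core pooling step of DiffPool comes down to two lines of linear algebra: a soft assignment S of n nodes to m clusters produces the features and adjacency of a coarsened m-node graph via X' = S^T X and A' = S^T A S. In the sketch below S is a random placeholder; in DiffPool it is predicted by a GNN and trained end-to-end.

    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(z, axis=-1):
        z = z - z.max(axis=axis, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=axis, keepdims=True)

    def diffpool_coarsen(X, A, S):
        """DiffPool's pooling step: coarsen node features and adjacency with
        a soft cluster-assignment matrix S of shape (n, m)."""
        return S.T @ X, S.T @ A @ S

    n, m, d = 10, 3, 4
    X = rng.normal(size=(n, d))                   # node features
    A = (rng.random((n, n)) < 0.3).astype(float)
    A = np.triu(A, 1); A = A + A.T                # random symmetric adjacency
    S = softmax(rng.normal(size=(n, m)), axis=1)  # soft cluster assignment

    X_pooled, A_pooled = diffpool_coarsen(X, A, S)
    print(X_pooled.shape, A_pooled.shape)         # (3, 4) (3, 3)

Stacking several such pooling layers yields the multi-level representation described above.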
Core references:
Chen, Junwu, and Philippe Schwaller. "Molecular hypergraph neural networks." (2024)
Supervisor:
Topic 6: Asynchronous message passing
A recent approach to overcoming the limitations of conventional GNNs (see Topics 1 and 2) is to replace synchronous message passing with asynchronous communication between nodes.
"Cooperative Graph Neural Networks" introduces a new GNN framework based on this idea.
Core references:
Finkelshtein, Ben, et al. "Cooperative Graph Neural Networks." (2024)
Supervisor:
Topic 7: DiGress: discrete diffusion denoising for graph generation
Applications such as drug discovery or materials science may benefit immensely from generative models for graphs. DiGress is a recent generative model for graphs based on discrete denoising diffusion. When first published, it set new standards for both conditional and unconditional generation of small molecular graphs.
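To illustrate the discrete (as opposed to Gaussian) noising process, here is a sketch of one forward-diffusion step on categorical node types with uniform transition matrices; note that DiGress also noises edge types and, in its best-performing variant, uses the training-set marginals instead of the uniform distribution.

    import numpy as np

    rng = np.random.default_rng(0)

    def noise_step(types, alpha, num_classes):
        """One forward step of discrete diffusion on categorical labels:
        with probability alpha keep the current class, otherwise resample
        uniformly. This is the transition matrix
            Q = alpha * I + (1 - alpha) * (1/K) * 1 1^T
        applied row-wise."""
        Q = alpha * np.eye(num_classes) + (1 - alpha) / num_classes
        return np.array([rng.choice(num_classes, p=Q[t]) for t in types])

    atom_types = np.array([0, 0, 1, 2, 1])   # e.g. C, C, N, O, N
    for t, alpha in enumerate([0.99, 0.9, 0.5, 0.1]):
        atom_types = noise_step(atom_types, alpha, num_classes=4)
        print(t, atom_types)                 # structure dissolves into noise

The generative model is then trained to invert this corruption step by step, which is what enables sampling new graphs from pure noise.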
Core references:
Vignac, Clement, et al. "DiGress: Discrete denoising diffusion for graph generation." (2023)
Supervisor:
Topic 8: GNNExplainer
GNNExplainer is a widely used approach for identifying compact subgraph structures and node features that play a crucial role in the predictions made by a GNN.
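The sketch below mimics the idea on a frozen toy model: learn a soft edge mask that keeps the target node's prediction high while penalising mask size; the surviving edges form the explanation subgraph. It simplifies GNNExplainer in several ways (the real method optimises a mutual-information objective on a trained model with autograd and also masks node features; this sketch uses a random toy GNN and finite-difference gradients).

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # a frozen toy 2-layer GNN whose prediction for one target node we "explain"
    X = rng.normal(size=(6, 4))
    A = np.zeros((6, 6))
    for i, j in [(0, 1), (1, 2), (2, 3), (0, 4), (4, 5)]:
        A[i, j] = A[j, i] = 1.0
    W1, W2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 1))
    target = 0

    def predict(A_soft):
        H = np.tanh(A_soft @ X @ W1)
        return (A_soft @ H @ W2)[target, 0]

    def objective(M, lam=0.05):
        # keep the target's prediction high while keeping the mask sparse
        mask = sigmoid(M) * A
        return predict(mask) - lam * sigmoid(M)[A > 0].sum()

    # gradient ascent on the mask logits, with finite-difference gradients
    M = np.zeros_like(A)
    for _ in range(100):
        base, eps, grad = objective(M), 1e-4, np.zeros_like(M)
        for i, j in zip(*np.nonzero(A)):
            M_pert = M.copy()
            M_pert[i, j] += eps
            grad[i, j] = (objective(M_pert) - base) / eps
        M += 0.5 * grad

    print((sigmoid(M) * A).round(2))  # surviving edges = explanation subgraph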
Core references:
Ying, Zhitao, et al. "GNNExplainer: Generating explanations for graph neural networks." (2019)
Longa, Antonio, et al. "Explaining the explainers in graph neural networks: A comparative study." (2024)
Supervisor:
Topic 9: GCPN: Combining GNNs and reinforcement learning for conditional molecular generation
GCPN is an earlier but straightforward approach in which a GNN-based policy network is trained with reinforcement learning to generate molecular graphs that optimize a molecular property of interest while remaining similar to a set of reference molecules.
Core references:
You, Jiaxuan, et al. "Graph convolutional policy network for goal-directed molecular graph generation." (2018)
Supervisor: