Topics

Note: these topics are preliminary and should not be treated as final until October 21st.

Topic 1: Expressiveness -- GNNs and the Weisfeiler-Lehman test

"Vanilla" GNNs are unable to serve as universal function approximators for the graph domain due to their inability to distinguish some isomorphic graphs. This property manifests within their close relation to the WL-test for graph isomorphism. Understanding this relation is foundational to understanding aspects of the expressiveness of GNNs and consequent limitations.

Core references:

TBA

 

Topic 2: Expressiveness -- Common bottlenecks of GNNs and how to overcome them

Akin to recurrent NNs, GNNs suffer from a bottleneck that hinders the propagation of information between distant nodes. Understanding this over-squashing bottleneck and devising remedies can substantially improve the performance of GNNs on problems with long-range interactions.
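
One remedy studied in the core reference is to make the final layer fully adjacent, so that every pair of nodes can exchange information directly. A minimal numpy sketch of that idea follows; the mean-aggregation layer, weight shapes, and normalization are our own simplifications.

```python
import numpy as np

def mp_layer(A_hat, X, W):
    """One mean-aggregation message-passing layer with ReLU."""
    return np.maximum(A_hat @ X @ W, 0.0)

def gnn_with_fa_layer(A, X, weights):
    """Run all but the last layer on the input graph, then apply the
    final layer on a fully-adjacent graph (every node attends to all
    others), easing the over-squashing of long-range information."""
    n = A.shape[0]
    A_loop = A + np.eye(n)                              # add self-loops
    A_hat = A_loop / A_loop.sum(axis=1, keepdims=True)  # row-normalize
    for W in weights[:-1]:
        X = mp_layer(A_hat, X, W)
    A_full = np.ones((n, n)) / n                        # fully-adjacent graph
    return mp_layer(A_full, X, weights[-1])

# toy usage on a path graph with random weights
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
weights = [rng.normal(size=(8, 8)), rng.normal(size=(8, 8)), rng.normal(size=(8, 4))]
print(gnn_with_fa_layer(A, X, weights).shape)  # (4, 4)
```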

Core references:

https://arxiv.org/pdf/2006.05205

 

Topic 3: Graph transformers and graph positional encodings

Self-attention and message passing are closely related mechanisms for propagating information in modern neural networks. Graph transformers combine advances from the Transformer and GNN communities, giving rise to a new, powerful class of models for working with graph-structured data. Graph positional encodings offer a simple way of encoding additional information about the graph structure into node embeddings; they are essential for graph transformers, but can also be used on their own in combination with conventional GNNs.
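
A common choice of graph positional encoding uses eigenvectors of the graph Laplacian. The sketch below, with our own function names and the symmetric-normalized Laplacian as one concrete choice, shows how such encodings can be computed and appended to node features.

```python
import numpy as np

def laplacian_pe(A, k=4):
    """Positional encodings from the eigenvectors of the symmetric-
    normalized graph Laplacian belonging to the k smallest non-trivial
    eigenvalues. Note: eigenvector signs are arbitrary, which graph
    transformers typically handle by random sign flips during training."""
    deg = np.maximum(A.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]             # drop the trivial first eigenvector

# toy usage: append 2-dimensional encodings to the nodes of a 6-cycle
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
X = np.ones((n, 3))                        # placeholder node features
X_pe = np.concatenate([X, laplacian_pe(A, k=2)], axis=1)
print(X_pe.shape)  # (6, 5)
```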

Core references:

TBA

 

Topic 4: Geometric GNNs and their application to molecular property prediction

A natural extension to representing molecules as graphs is to embed them into Euclidean space by assigning appropriate 3D coordinates to each atom/node. Geometric GNNs adapt the conventional message-passing framework to this type of geometric graph and have been shown to improve performance on some property prediction tasks.
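
The key adaptation is that messages take the geometry into account while respecting its symmetries. Below is a minimal sketch of an E(3)-invariant layer in the spirit of distance-based models such as SchNet; the exponential distance weighting, weight matrix, and update rule are our own simplifications.

```python
import numpy as np

def invariant_mp_layer(pos, h, edges, W):
    """Message passing where messages depend only on node features and
    pairwise distances, so the output is unchanged under rotations and
    translations of the atomic coordinates."""
    out = np.zeros_like(h)
    for i, j in edges:                         # directed edge j -> i
        dist = np.linalg.norm(pos[i] - pos[j])
        out[i] += np.exp(-dist) * (h[j] @ W)   # distance-weighted message
    return np.maximum(h + out, 0.0)            # residual update + ReLU

# toy molecule: 3 atoms with 3D coordinates, fully connected
rng = np.random.default_rng(1)
pos, h = rng.normal(size=(3, 3)), rng.normal(size=(3, 8))
edges = [(i, j) for i in range(3) for j in range(3) if i != j]
W = rng.normal(size=(8, 8))
h_new = invariant_mp_layer(pos, h, edges, W)

# invariance check: rotating the molecule does not change the output
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
assert np.allclose(h_new, invariant_mp_layer(pos @ R.T, h, edges, W))
```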

Core references:

TBA

 

Topic 5: Hierarchical message passing

Hierarchical methods learn to operate on a multi-level representation of the graph rather than a flat node-to-node message-passing scheme. This helps the GNN access richer structural information, such as long-range dependencies and hierarchical patterns, that is difficult to capture with traditional GNNs.
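
The core reference (DiffPool) realizes this with a learned soft clustering: an assignment matrix S coarsens both features and adjacency. A minimal sketch of one pooling step follows, with single linear layers standing in for the paper's full GNNs:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diffpool_step(A, X, W_embed, W_assign):
    """One DiffPool coarsening: compute node embeddings Z and a soft
    cluster assignment S, then pool X' = S^T Z and A' = S^T A S."""
    Z = np.maximum(A @ X @ W_embed, 0.0)    # node embeddings
    S = softmax(A @ X @ W_assign, axis=1)   # soft assignment of nodes to clusters
    return S.T @ A @ S, S.T @ Z             # coarsened adjacency and features

# toy usage: pool a random 10-node graph down to 3 clusters
rng = np.random.default_rng(2)
A = np.triu(rng.integers(0, 2, size=(10, 10)).astype(float), 1)
A = A + A.T                                 # random undirected graph
X = rng.normal(size=(10, 16))
A2, X2 = diffpool_step(A, X, rng.normal(size=(16, 16)), rng.normal(size=(16, 3)))
print(A2.shape, X2.shape)  # (3, 3) (3, 16)
```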

Core references:

https://arxiv.org/pdf/1806.08804

Topic 6: Asynchronous message passing

 

Core references:

TBA

 

Topic 7: DiGress: discrete diffusion denoising for graph generation

Applications such as drug discovery or materials science may benefit from generative models for graphs. DiGress is a recent diffusion model for graph generation based on discrete denoising diffusion: node and edge categories are progressively corrupted in a discrete state space, and a network is trained to reverse this process.
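
To illustrate the discrete forward process, here is a toy numpy sketch of noising categorical node types with a uniform transition matrix (DiGress noises edge categories too and uses more refined noise models; the hyperparameters here are arbitrary):

```python
import numpy as np

def uniform_transition(K, beta):
    """Transition matrix Q: keep the current category with probability
    (1 - beta), otherwise resample uniformly among all K categories."""
    return (1 - beta) * np.eye(K) + beta * np.ones((K, K)) / K

def noise_step(x_onehot, Q, rng):
    """Sample x_t ~ Categorical(x_{t-1} Q) independently per node."""
    probs = x_onehot @ Q
    return np.array([rng.multinomial(1, p) for p in probs])

rng = np.random.default_rng(3)
K = 4                                        # e.g., 4 atom types
x = np.eye(K)[rng.integers(0, K, size=6)]    # 6 nodes with one-hot types
for t in range(10):                          # forward chain towards uniform noise
    x = noise_step(x, uniform_transition(K, beta=0.2), rng)
print(x.argmax(axis=1))                      # noised node types
```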

Core references:

https://arxiv.org/pdf/2209.14734

 

Topic 8: Explainable AI methods for GNNs

 

Core references:

TBA

 

Topic 9: GCPN: Combining GNNs and reinforcement learning for conditional molecular generation

GCPN is a less recent but straightforward approach in which a GNN-based policy network is trained with reinforcement learning to generate molecular graphs that optimize a molecular property of interest while remaining faithful to a set of reference molecules.
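
The RL machinery underneath is standard policy-gradient training. The toy sketch below shows a REINFORCE update for a categorical policy; it is only a stand-in for GCPN's actual setup, where the state is a partial molecular graph, the policy is a GNN over candidate bond additions, and the reward combines property scores with an adversarial similarity term.

```python
import numpy as np

def reinforce_step(theta, reward_fn, rng, lr=0.1):
    """One REINFORCE update: sample an action from a softmax policy,
    observe its reward, and ascend the policy gradient."""
    probs = np.exp(theta) / np.exp(theta).sum()
    a = rng.choice(len(theta), p=probs)      # sample a graph-building action
    grad_logp = -probs
    grad_logp[a] += 1.0                      # gradient of log softmax at action a
    return theta + lr * reward_fn(a) * grad_logp

rng = np.random.default_rng(4)
theta = np.zeros(3)                          # 3 hypothetical edge-addition actions
reward = lambda a: [0.1, 1.0, 0.2][a]        # pretend action 1 improves the property most
for _ in range(500):
    theta = reinforce_step(theta, reward, rng)
print(theta.argmax())                        # 1: the policy prefers the high-reward action
```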

Core references:

https://arxiv.org/pdf/1806.02473
