Keynote Speakers

Monika Henzinger, University of Vienna

Recent Advances in Fully Dynamic Graph Algorithms

Abstract: In recent years, significant advances have been made in the design and analysis of fully dynamic algorithms. However, these theoretical results have received very little attention from a practical perspective: few of these algorithms have been implemented and tested on real datasets, and their practical potential is far from understood. Here, we present a quick reference guide to recent engineering and theory results in the area of fully dynamic graph algorithms.
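
For readers unfamiliar with the setting, the sketch below illustrates what "fully dynamic" means: a graph that must support interleaved edge insertions, edge deletions, and queries. The recompute-from-scratch baseline shown here is only the naive yardstick that the algorithms surveyed in the talk improve on; the class and method names are illustrative, not taken from any surveyed implementation.

    class FullyDynamicConnectivity:
        """Naive baseline for fully dynamic connectivity: maintain the
        edge set and answer each query by searching from scratch."""

        def __init__(self, n):
            self.n = n          # vertices 0 .. n-1
            self.edges = set()  # current edge set

        def insert_edge(self, u, v):
            self.edges.add((min(u, v), max(u, v)))

        def delete_edge(self, u, v):
            self.edges.discard((min(u, v), max(u, v)))

        def connected(self, s, t):
            # Naive O(n + m) search per query; fully dynamic algorithms
            # instead maintain a data structure so that insertions,
            # deletions, and queries each cost polylog(n) amortized.
            adj = {v: [] for v in range(self.n)}
            for a, b in self.edges:
                adj[a].append(b)
                adj[b].append(a)
            seen, stack = {s}, [s]
            while stack:
                u = stack.pop()
                if u == t:
                    return True
                for w in adj[u]:
                    if w not in seen:
                        seen.add(w)
                        stack.append(w)
            return False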

Paul Spirakis, University of Liverpool and University of Patras

Algorithmic Problems on Temporal Graphs

Abstract: Research on temporal graphs has expanded in the last few years. Most results to date address problems related to the notion of temporal paths (and temporal connectivity). In this talk, we focus instead on problems that do not center on temporal paths. In particular, we will discuss temporal vertex covers, the notion of temporal transitivity, and issues and models of stochastic temporal graphs. We believe that several algorithmic graph problems not directly related to paths can be posed in the temporal domain. This may motivate new research towards lifting more topics of algorithmic graph theory to the temporal case.
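
As background for the temporal-path notion the abstract contrasts against, the sketch below uses the standard model in which each edge carries a discrete time label and a temporal path must traverse edges in increasing time order (here, strictly increasing; some models allow non-decreasing labels). The function name and edge encoding are illustrative only.

    def earliest_arrival(edges, source, target):
        """edges: iterable of (u, v, t) triples, undirected, time label t.
        Returns the earliest time target is reachable from source via a
        time-respecting path, or None if no temporal path exists."""
        # One pass over edges sorted by time label: with strictly
        # increasing labels along a path, an edge at time t can only
        # extend arrivals established at earlier labels, all of which
        # were processed before it in the sorted order.
        arrival = {source: float("-inf")}  # source available from the start
        for u, v, t in sorted(edges, key=lambda e: e[2]):
            for a, b in ((u, v), (v, u)):
                if arrival.get(a, float("inf")) < t < arrival.get(b, float("inf")):
                    arrival[b] = t
        return arrival.get(target)

    # Edge {2,3} at time 1 followed by {1,2} at time 2 gives no temporal
    # path from 1 to 3, even though the static graph is connected:
    print(earliest_arrival([(1, 2, 2), (2, 3, 1)], 1, 3))  # None
    print(earliest_arrival([(1, 2, 1), (2, 3, 2)], 1, 3))  # 2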

Roger Wattenhofer, ETH Zurich

Networks, Dynamics, Algorithms, and Learning

Abstract: Networks are notoriously difficult to understand, and adding dynamics does not help. Can the current wonder weapon of computation (yes, machine learning) come to the rescue? Unfortunately, learning with networks is generally not well understood. "Neural network networks" (better and less confusingly known as graph neural networks) can learn simple graph patterns, but they are a far cry from their impressive machine learning cousins in the image and game domains. In my opinion, the most astonishing graph neural networks are in fact those dealing with dynamic networks: they simulate sand (the granular material, not the symposium) quite naturally. In my talk, I will discuss and compare different computational objects and paradigms: networks, dynamics, algorithms, and learning. What are the differences? And what can they learn from each other? In the technical part of the talk, I will present DropGNN, our new algorithm-inspired approach for handling graph neural networks. But mostly I will vent about misunderstandings and mistakes, and propose open questions and new research directions.

DropGNN is joint work with Pál András Papp, Karolis Martinkus, and Lukas Faber, published at NeurIPS, December 2021.
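
The sketch below illustrates the node-dropout idea behind DropGNN as described in the publication: the same GNN is executed several times, each run independently removing every node with a small probability p, and the per-node outputs are averaged across runs. The tiny one-round message-passing "GNN" used here (summing neighbour features) is a stand-in for exposition, not the authors' model, and the parameter names are illustrative.

    import random

    def message_passing(adj, feats, active):
        # One round of neighbourhood aggregation restricted to active nodes.
        return {
            v: feats[v] + sum(feats[u] for u in adj[v] if u in active)
            for v in active
        }

    def drop_gnn(adj, feats, runs=10, p=0.1, seed=0):
        rng = random.Random(seed)
        nodes = list(adj)
        totals = {v: 0.0 for v in nodes}
        for _ in range(runs):
            # Each run independently drops every node with probability p.
            active = {v for v in nodes if rng.random() >= p}
            out = message_passing(adj, feats, active)
            for v in nodes:
                totals[v] += out.get(v, 0.0)  # dropped nodes contribute 0
        return {v: totals[v] / runs for v in nodes}

    # Toy triangle graph with unit features:
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    print(drop_gnn(adj, {0: 1.0, 1: 1.0, 2: 1.0}))

Because each run sees a slightly different graph, nodes whose fixed neighbourhoods would be indistinguishable to a standard message-passing GNN can receive different aggregate views across runs, which is the intuition behind the expressiveness gain the paper establishes.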