Event
Public PhD defence of Esther Rodrigo Bonet on October 9

On October 9th, 2024, at 16:30, Esther Rodrigo Bonet will defend her PhD thesis, entitled “EXPLAINABLE AND PHYSICS-GUIDED GRAPH DEEP LEARNING FOR AIR POLLUTION MODELLING”.

Everybody is invited to attend the presentation in room I.0.02.

Abstract 

Air pollution has become a worldwide concern due to its negative impact on the population’s health and well-being. To mitigate its effects, it is essential to accurately monitor pollutant concentrations across regions and over time. Traditional solutions rely on physics-driven approaches, leveraging particle motion equations to predict how pollutant concentrations evolve over time. Although reliable and easy to interpret, these approaches are computationally expensive and require background domain knowledge. Alternatively, recent works have shown that data-driven approaches, especially deep learning models, significantly reduce the computational expense and provide accurate predictions, albeit at the cost of massive data and storage requirements and lower interpretability.

This PhD research develops innovative air pollution monitoring solutions that combine high accuracy, manageable complexity, and high interpretability. To this end, it proposes various graph-based deep learning solutions built around two key aspects: physics-guided deep learning and explainability.

First, since the data points in smart-city data are correlated, we propose exploiting these correlations using graph-based deep learning techniques. Specifically, we leverage generative models that have proven effective in data generation tasks, namely variational graph autoencoders. The proposed models employ graph convolutional operations and data fusion techniques to exploit the graph structure and the multi-modality of the data at hand. Additionally, we design physics-guided deep learning models that follow well-studied physical equations: by updating the graph convolution operator of graph convolutional networks to incorporate the convection-diffusion equation from physics, we physically guide the learning of the network, as sketched below.
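To make the physics-guided idea concrete, the following is a minimal, hypothetical sketch (not the thesis code) of a graph convolution whose propagation step follows a discretized convection-diffusion equation, using the graph Laplacian for the diffusion term and a directed adjacency (e.g. derived from wind direction) for the convection term; all names, parameters, and the explicit Euler discretization are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConvectionDiffusionGraphConv(nn.Module):
    """Hypothetical physics-guided graph convolution: one Euler step of
    dc/dt = D * Laplacian(c) - v * Advection(c), followed by a learnable mixing."""

    def __init__(self, in_dim, out_dim, dt=0.1):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)                 # learnable feature transform
        self.diffusivity = nn.Parameter(torch.tensor(0.1))    # learnable diffusion coefficient D
        self.velocity = nn.Parameter(torch.tensor(0.1))       # learnable convection strength v
        self.dt = dt                                          # time-step of the Euler update

    def forward(self, x, adj_sym, adj_dir):
        # x:       (N, in_dim) node features, e.g. pollutant concentrations per station
        # adj_sym: (N, N) symmetric adjacency used for the diffusion term
        # adj_dir: (N, N) directed adjacency (e.g. wind direction) for the convection term
        laplacian = torch.diag(adj_sym.sum(dim=1)) - adj_sym          # graph Laplacian L = D - A
        diffusion = -self.diffusivity * (laplacian @ x)               # ~ D * Laplacian(c)
        advection_op = torch.diag(adj_dir.sum(dim=1)) - adj_dir       # directed graph advection operator
        convection = -self.velocity * (advection_op @ x)              # ~ -v . grad(c)
        x = x + self.dt * (diffusion + convection)                    # one explicit Euler step
        return self.lin(x)

# toy usage: 4 monitoring stations, 3 features per station
layer = ConvectionDiffusionGraphConv(in_dim=3, out_dim=8)
a = torch.rand(4, 4)
out = layer(torch.randn(4, 3), (a + a.t()) / 2, torch.rand(4, 4))
```

In such a layer, the propagation rule is fixed by the physics, while the learnable parameters (diffusivity, convection strength, and the feature transform) are fitted to the data.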

The second key aspect relates to explainability. Specifically, we design novel explainability techniques for interpretable graph-based deep learning. We explore existing explainability algorithms, including Lasso and a layer-wise relevance propagation approach, and extend them to our graph-based architectures, designing efficient and specifically tailored explanation tools. Our explanation techniques provide insights and visualizations based on the various input data sources.
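As an illustration of the explainability direction, here is a small, hypothetical sketch (not the thesis method) of an epsilon-rule layer-wise relevance propagation step for a single graph convolution layer Z = A_hat X W, redistributing the relevance assigned to the layer's output back onto the input nodes and features; the function name and signature are assumptions made for this example.

```python
import torch

def lrp_graph_conv(x, a_hat, weight, r_out, eps=1e-6):
    # x:      (N, F_in)  input node features
    # a_hat:  (N, N)     normalized adjacency used by the layer
    # weight: (F_in, F_out) layer weights
    # r_out:  (N, F_out) relevance assigned to the layer's output
    z = a_hat @ x @ weight                                   # forward pre-activations Z = A_hat X W
    stab = z + eps * torch.where(z >= 0,                     # epsilon stabilizer, sign-matched
                                 torch.ones_like(z),
                                 -torch.ones_like(z))
    s = r_out / stab                                         # relevance per unit of contribution
    c = a_hat.t() @ s @ weight.t()                           # back-propagated contribution weights
    r_in = x * c                                             # relevance of each (node, feature) pair
    return r_in
```

Summing the returned relevance over the feature dimension gives a per-node score that can be visualized on the monitoring-station graph, indicating which stations and inputs drove a given prediction.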

Overall, the research has produced state-of-the-art models that combine the strengths of physics-guided, graph-based deep learning and explainable approaches for inferring, predicting, and explaining air pollution. The developed techniques can also be applied to other graph-modelling problems on the Internet, such as recommender systems.

 
 