Seminar Graphs&Data @ TU Delft - 9 November


09 November 2023, 10:30 till 12:00

This is a series of seminars/talks bringing together people from across TU Delft doing research on Graphs and Data, who can benefit from exchanging ideas with colleagues working on different topics.

This seminar took place on Thursday 9th November, from 10:30 to 12:00.

You can download all of the presented slides here.

 

Speakers

Megha Khosla

Recording: here

Title: Explainable Graph Machine Learning: Challenges and Solutions

Abstract: Graph-based machine learning (GraphML) has led to state-of-the-art improvements in various scientific tasks, yet its opaque nature hinders its full potential in sensitive domains. In this talk, I will commence by providing a brief overview of prevalent notions of post-hoc explanations for GraphML techniques, shedding light on the intricacies of finding and evaluating these explanations. I will present our proposed effective solutions and conclude by exploring intriguing problems that continue to stimulate research in this domain.
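To make the notion of a post-hoc explanation concrete: one common family of explainers scores graph components (e.g. edges) by how much perturbing them changes a trained model's prediction. A minimal illustrative sketch, assuming a toy one-layer GCN in numpy (this is a generic perturbation explainer, not the speaker's proposed method; all names are made up):

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN layer: add self-loops, row-normalise, propagate, nonlinearity."""
    A_hat = A + np.eye(A.shape[0])
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return np.tanh(D_inv * (A_hat @ X) @ W)

def edge_importance(A, X, W, node, cls):
    """Post-hoc explanation: score each edge by how much deleting it
    changes the model's output for the queried node and class."""
    base = gcn_layer(A, X, W)[node, cls]
    scores = {}
    for i, j in zip(*np.triu(A).nonzero()):      # each undirected edge once
        A_pert = A.copy()
        A_pert[i, j] = A_pert[j, i] = 0.0        # remove the edge
        scores[(i, j)] = abs(base - gcn_layer(A_pert, X, W)[node, cls])
    return scores

# Toy path graph 0-1-2-3; explain node 0's prediction for class 0.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                      # random node features
W = rng.normal(size=(3, 2))                      # random (untrained) weights
scores = edge_importance(A, X, W, node=0, cls=0)
```

With a one-layer model, node 0's output depends only on its immediate neighbourhood, so the distant edge (2, 3) gets importance exactly zero while edge (0, 1) gets a nonzero score; evaluating whether such scores match human intuition is one of the evaluation intricacies the talk addresses.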

 

Tianqi Zhao

Recording: here

Title: Multi-label Node Classification On Graph-Structured Data

Abstract: Graph Neural Networks (GNNs) have exhibited remarkable progress in node classification on graphs, particularly in a multi-class setting. However, their applicability to the more realistic multi-label classification scenario, where nodes can have multiple labels, has been largely overlooked. This talk will unveil the limitations of current GNNs in handling multi-label classification tasks. I will also highlight how existing GNNs, designed for either homophilic or heterophilic characteristics of node labels, fall short in capturing the nuanced complexities of multi-label datasets that don't conform to such clear distinctions.
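The multi-class/multi-label distinction the abstract draws shows up already in the output head: a multi-class GNN takes an argmax over a softmax, so every node gets exactly one label, while a multi-label head thresholds independent sigmoids, so a node may receive several labels (or none). A minimal numpy sketch with a mean-aggregation message-passing step (illustrative only, not the speaker's method):

```python
import numpy as np

def propagate(A, X):
    """Mean-aggregate neighbour features (with a self-loop): one message-passing step."""
    A_hat = A + np.eye(A.shape[0])
    return (A_hat @ X) / A_hat.sum(axis=1, keepdims=True)

def multilabel_head(H, W, threshold=0.5):
    """Independent sigmoid per label: a node can receive several labels (or none)."""
    probs = 1.0 / (1.0 + np.exp(-(H @ W)))
    return probs > threshold                 # boolean (n_nodes, n_labels) matrix

def multiclass_head(H, W):
    """Softmax + argmax: exactly one label per node."""
    return (H @ W).argmax(axis=1)

# Two connected nodes, three candidate labels.
A = np.array([[0, 1], [1, 0]], dtype=float)
X = np.array([[1.0, 0.0], [0.0, 1.0]])
W = np.array([[2.0, 3.0, -1.0],
              [2.0, 1.0, -3.0]])
H = propagate(A, X)
ml = multilabel_head(H, W)                   # rows may contain multiple True entries
mc = multiclass_head(H, W)                   # one class index per node
```

Here the multi-label head assigns two labels to each node, something the multi-class head structurally cannot express; the talk examines why GNN architectures (and their homophily/heterophily assumptions) behave differently once labels overlap like this.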

 

Ojas Shirekar

Recording: here

Title: Self-Attention Message Passing for Contrastive Few-Shot Image Classification

Abstract: Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) in discovering complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined as SAMP-CLR) for U-FSL pre-training. This work also proposes an optimal transport (OT) based fine-tuning strategy (called OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer).
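The "self-attention based message passing" ingredient can be sketched generically: each node aggregates its neighbours' (value-projected) features with weights given by a softmax over scaled query-key dot products, masked to the graph's edges. A minimal numpy sketch under these assumptions (illustrative of the general mechanism, not of SAMP-CLR itself):

```python
import numpy as np

def self_attention_message_passing(A, X, Wq, Wk, Wv):
    """One round of message passing where each node attends to its
    neighbours (and itself) via masked scaled dot-product attention."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])         # pairwise attention logits
    mask = (A + np.eye(A.shape[0])) > 0              # attend only along edges + self
    scores = np.where(mask, scores, -np.inf)         # block non-edges
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)    # row-wise softmax
    return weights @ V                               # attention-weighted messages

rng = np.random.default_rng(1)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)               # star graph on 3 nodes
X = rng.normal(size=(3, 4))                          # e.g. image embeddings as nodes
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
H = self_attention_message_passing(A, X, Wq, Wk, Wv)
```

In a few-shot setting the nodes would be per-image embeddings of a task's samples, so this step lets each sample's representation be refined by its relations to the others before the contrastive objective is applied.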