Workshop: Navigating the Interplay of Explainability and Privacy in AI

8 February 2024, 08:00 to 9 February 2024, 17:00 - Location: Aula Conference Centre of TU Delft – Commissiekamer 3 (Mekelweg 5, 2628 CC Delft)

The Delft Design for Values Institute hosts a workshop series on Values and Value Conflicts. On February 8 and 9, the series opens with Navigating the Interplay of Explainability and Privacy in AI.

In the ever-evolving landscape of artificial intelligence, where ethical considerations intersect with innovation, the Delft Design for Values Institute (DDfV) is developing a workshop series on values and value conflicts in AI systems, with a specific focus on their application in real-world domains. Through this series, the DDfV aims to become a crucible of knowledge, fostering discussions that guide AI development towards a more responsible and equitable future. Together, participants will delve into the intricate web of values, biases, and choices that underlie AI systems, recognizing that deploying these powerful techniques in real-world domains such as healthcare and finance carries unique responsibilities and challenges.

The inaugural workshop is hosted by Dr. Megha Khosla, Assistant Professor in the Multimedia Computing Group of the Intelligent Systems Department at TU Delft. It homes in on the values of explainability and privacy, addressing their critical role in the development of AI models for healthcare and medicine. The DDfV recognizes that in this field the stakes are exceptionally high: AI technologies have the potential to revolutionize patient care, diagnosis, and treatment, but with great innovation comes even greater responsibility. Participants will explore the intricate interplay of these two values, highlighting the need for, and the challenges of, making AI methods more transparent while preserving the privacy of the data they use.

Interested? You can find the full programme and register by clicking the button below: