Our research on responsible innovation

We study responsible innovation across a broad range of technologies and application areas, such as ICT (including AI and robotics), energy, water, transportation, the chemical industry, health care and biotechnologies. We do this (a) from the perspective of societal, public and ethical values and (b) within the context of socio-technical systems.

Societal challenges require responsible innovation

Developments such as climate change, the energy transition, digitisation, AI, robotisation and new health technologies have major implications for society, and policy makers are grappling with how to deal with these issues.

An innovation that lacks sufficient public support runs the risk of failing. The positive contribution it could have made to society is then lost. Conversely, an innovation is sometimes pushed through despite a lack of public support, which can mean that valid concerns have been ignored.

How can we realize the potential benefits of innovations in a responsible way? That is the central question around which our research revolves.

Our distinctive focus: values & socio-technical systems

Responsible innovation is the alignment of technological and institutional innovation with societal and ethical values, needs and expectations. This involves a broader range of values than the instrumental values that are common in engineering (such as effectiveness, safety, compatibility and reliability). For example, we also look at sustainability, security, autonomy, freedom, privacy, transparency, justice, equity, democracy, diversity and inclusion.

Understanding the institutional design of socio-technical systems and the policy dimensions of innovation processes is equally important for the further development, social acceptance and moral acceptability of new technologies. We thus study innovations not only from a value perspective, but also in the context of the wider socio-technical systems that they are part of.

Our unique profile: economics, safety science & philosophy

Responsible innovation requires combining empirical studies with normative approaches. It requires both qualitative and quantitative research, as well as ethical reflection and value-sensitive economic and risk models. The Department of Values, Technology & Innovation therefore brings together experts from economics, safety and security science, and ethics/philosophy.

Our researchers study responsible innovation grounded in their own disciplinary expertise, but they also collaborate with researchers from other fields. When properly integrated, insights from various disciplines may lead to innovations that are not only technologically sound, but also economically feasible, socially accepted and morally acceptable.

Our aim: both societal and academic impact

Our research agenda is not only relevant to society, but also leads to new research questions and innovative methodologies. We develop, empirically test and apply theories, methods, approaches, tools and conceptualisations that support or contribute to responsible innovation.

Most of our research iterates between addressing real societal challenges and the theoretical challenges that underlie them. Our more foundational research feeds into our inter-/multidisciplinary research on responsible innovation. In this way, we aim to combine the highest research quality with societal relevance.

Our research portfolio

We continuously strive for a high-quality, groundbreaking academic research portfolio that is

  • engaged with societally relevant questions in order to create societal impact
  • a fruitful mix of foundational, applied and inter-/multidisciplinary research
  • based on a diversity of perspectives
  • informed by concrete technologies and specific challenges
  • a source of direct input to policy, industry and society

In that way, we aim to contribute to a more just and sustainable society.

Concrete examples

Looking for some concrete examples of our research? You can find them here: