In our research, we study the role of values in technological innovations, bringing together experts from economics, risk and safety science, and philosophy. This combination of disciplines in one overarching department is unique in the world. It fills an important academic and societal lacuna by contributing to research on responsible innovation in a multidisciplinary way, combining empirical, quantitative, conceptual and normative approaches.
How can we make innovations more responsible, so that both the processes and the products of innovation do justice to important social and ethical values, taking into account that the effects of innovations are often uncertain or unknown and that innovation takes place in sociotechnical systems?
The research mission of the VTI department is to contribute to responsible innovation by:
- Identifying, analyzing and improving attention to the value and responsibility dimensions of governance, engineering and technology from a sociotechnical systems perspective, with special attention to (but not limited to) values such as safety, security, efficiency, justice, privacy and sustainability, and to trade-offs between these values, and with a focus on the design, innovation and diffusion of responsible innovation.
- Studying the institutional design of large sociotechnical systems, innovation processes and systems, and the role of entrepreneurship in innovation, with special attention to their value dimension, with the aim of identifying opportunities for making innovations and innovation processes (more) responsible.
- Developing, applying and empirically testing theories, methodologies, methods, approaches, tools and conceptualizations for responsible innovation, or that contribute to it.
Main research question: How can values be integrated into the design of technologies, institutions and sociotechnical systems?
One of the main approaches to responsible innovation that the VTI department works on is Design for Values, which aims at integrating non-instrumental values into the design of new technologies and innovations from the start. This raises a number of challenges.

One challenge is how to deal with value conflicts in design. Value conflicts may arise because different stakeholders hold different values, but also because a technology can usually not meet all the values that are relevant for its design, so that decisions have to be made, for example through value trade-offs.

Another challenge is the role of institutions and how they should be designed in order to do justice to values and responsible innovation. Institutions can often not be designed from scratch, because they usually already exist and evolve over time. This requires insight into how institutions develop, how they impact technological development, and how they relate to values.

A third challenge has to do with the value of responsibility. Determining and enhancing responsibility in sociotechnical systems is often problematic. Innovation and design are collective efforts, and the causal chains between innovators and the eventual social effects are long. In addition, a range of new technologies raise new responsibility problems; think, for example, of drones, robots and self-driving cars that autonomously make decisions. This may result in tensions between collective and individual responsibility and in responsibility gaps.
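To make concrete what a value trade-off in design can look like, here is a minimal weighted-sum sketch. The designs, values, scores and weights below are all hypothetical illustrations, not results from our research; real Design for Values work involves far richer value conceptualizations than a single aggregate score.

```python
# Illustrative only: a weighted-sum trade-off between two hypothetical
# design alternatives scored against three values on a 0-1 scale.

def weighted_score(scores, weights):
    """Aggregate per-value scores into one figure of merit."""
    return sum(scores[v] * weights[v] for v in weights)

# Hypothetical stakeholder weights over three values (sum to 1).
weights = {"safety": 0.5, "privacy": 0.3, "efficiency": 0.2}

# Two hypothetical design alternatives, scored per value.
designs = {
    "A": {"safety": 0.9, "privacy": 0.4, "efficiency": 0.7},
    "B": {"safety": 0.6, "privacy": 0.9, "efficiency": 0.8},
}

for name, scores in designs.items():
    # A: 0.5*0.9 + 0.3*0.4 + 0.2*0.7 = 0.71
    # B: 0.5*0.6 + 0.3*0.9 + 0.2*0.8 = 0.73
    print(name, round(weighted_score(scores, weights), 2))
```

Note that a different set of stakeholder weights can reverse the ranking, which is exactly why conflicts between stakeholders' values cannot be resolved by aggregation alone.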
Main research question: How can we operationalize, manage and incentivize responsible innovation in sociotechnical systems?
We make a novel contribution to innovation systems research by incorporating responsible innovation and a value dimension, building on insights we have developed in the past with respect to innovation management. Our more recent focus on managing responsible innovation, however, gives rise to new challenges.

One challenge is that making innovations more responsible often requires breaking through existing patterns of innovation. From innovation studies it is known that new players, such as start-ups, may play a crucial role in doing so. Studying the role of entrepreneurship in responsible innovation is therefore very important.

Another important challenge is how to operationalize and incentivize responsible innovation in sociotechnical systems. For example, an intrinsic motivation to take responsibility by making innovations more sustainable might be “crowded out” by economic incentives. Moreover, in many sociotechnical systems, such as energy, transport and communication infrastructures, the incentive structure is embedded in sector regulations and/or public oversight. This raises the question of how we can stimulate responsibility in sociotechnical systems, given that incentive structures are usually difficult to change and that incentive structures that work in the short term might have detrimental effects (due to “crowding out”) in the long run. Addressing this requires empirical studies and indicator development in combination with a normative perspective.
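As a purely hypothetical illustration of the crowding-out concern, the toy model below assumes that an economic incentive adds extrinsic effort immediately but gradually erodes intrinsic motivation; every parameter and functional form is invented for illustration, not an empirical claim.

```python
# Toy illustration of motivation "crowding out" (all numbers hypothetical):
# an incentive s adds extrinsic effort right away, but over time erodes
# intrinsic motivation, so long-run effort can fall below the baseline.

INTRINSIC = 1.0   # baseline intrinsic motivation (arbitrary units)
GAIN = 0.5        # extrinsic effort added per unit of incentive
CROWDING = 0.8    # fraction of intrinsic motivation eroded per unit incentive

def short_run_effort(s):
    # Intrinsic motivation has not yet adjusted to the incentive.
    return INTRINSIC + GAIN * s

def long_run_effort(s):
    # Intrinsic motivation is (partly) crowded out by the incentive.
    return INTRINSIC * max(0.0, 1.0 - CROWDING * s) + GAIN * s

baseline = long_run_effort(0.0)   # 1.0 with no incentive
print(short_run_effort(1.0))      # 1.5: the incentive helps at first
print(long_run_effort(1.0))       # 0.7: below baseline in the long run
```

The point of the sketch is only the qualitative pattern: an incentive that raises effort in the short term can leave long-run effort below the no-incentive baseline.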
Main research question: How are we to assess, manage and evaluate the risks of technologies and sociotechnical systems in a responsible way?
Risk is a key concern when it comes to responsible innovation. The effects of (responsible) innovation are frequently uncertain and may only surface once technologies are introduced into society. This raises the question of how to model and predict risks while taking into account technological, organizational and human factors. An important approach we develop and apply to address this is that of Bayesian Belief Networks (BBNs) for risk assessment.

A second challenge is how to integrate safety (unintentional harm) and security (intentional harm) in risk assessment and management, as safety and security increasingly interact and depend on each other. Here, among other things, we apply game theory to better understand and model such interactions.

A third challenge has to do with the question of how safe is safe enough. We investigate how moral values can be integrated into risk assessment and risk management, while also paying attention to the economic point of view. This raises the question of how to take the role of moral values and of emotions into account in existing, more formal frameworks for making decisions about acceptable risk. From a philosophical point of view, the topic of risk requires new theories, because traditional ethical theories have a hard time dealing with probabilities and uncertainties. The Ethics and Philosophy of Technology Section is internationally a key player in this new domain of risk ethics.

A fourth challenge is dealing with the fact that risks cannot be completely predicted or anticipated. We investigate this, for example, by considering the introduction of new technology into society as a social experiment in which risks and benefits only gradually become clear.
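To give a flavor of the BBN approach, the sketch below builds a toy network in which a technical defect and a human error jointly influence a failure, and answers both a predictive query (the marginal failure probability) and a diagnostic one (the probability of human error given a failure). The structure and all probabilities are hypothetical, chosen only to show the mechanics of enumeration-based inference.

```python
# A minimal Bayesian Belief Network sketch for risk assessment.
# Structure and probabilities are hypothetical, for illustration only:
# Technical defect (T) and Human error (H) jointly influence Failure (F).

from itertools import product

P_T = {True: 0.02, False: 0.98}   # prior: technical defect present
P_H = {True: 0.10, False: 0.90}   # prior: human error occurs
P_F_given = {                     # CPT: P(F=True | T, H)
    (True, True): 0.95,
    (True, False): 0.60,
    (False, True): 0.30,
    (False, False): 0.01,
}

def p_failure():
    """Marginal failure probability, enumerating all parent states."""
    return sum(P_T[t] * P_H[h] * P_F_given[(t, h)]
               for t, h in product([True, False], repeat=2))

def p_human_error_given_failure():
    """Diagnostic inference via Bayes' rule: P(H=True | F=True)."""
    joint = sum(P_T[t] * P_H[True] * P_F_given[(t, True)]
                for t in [True, False])
    return joint / p_failure()

print(round(p_failure(), 5))                  # ≈ 0.05092
print(round(p_human_error_given_failure(), 3))  # ≈ 0.615
```

Observing a failure raises the probability of human error from the 0.10 prior to roughly 0.615; real BBNs for risk assessment have many more nodes and use dedicated inference engines, but the reasoning pattern is the same.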
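The game-theoretic angle on safety-security interactions can be sketched in a similar toy fashion. Below, a defender chooses where to invest protection and an attacker chooses where to strike; all strategy names and payoffs are hypothetical, picked only to illustrate equilibrium analysis of such an interaction.

```python
# A minimal game-theoretic sketch of a security interaction.
# All payoffs are hypothetical; the game is zero-sum:
# payoff[d][a] = (defender_payoff, attacker_payoff).

DEFENDER = ["protect_plant", "protect_network"]
ATTACKER = ["attack_plant", "attack_network"]

payoff = {
    "protect_plant":   {"attack_plant": (-1, 1), "attack_network": (-4, 4)},
    "protect_network": {"attack_plant": (-5, 5), "attack_network": (-2, 2)},
}

def pure_nash():
    """Enumerate strategy pairs where neither side gains by deviating."""
    eq = []
    for d in DEFENDER:
        for a in ATTACKER:
            d_best = all(payoff[d][a][0] >= payoff[d2][a][0] for d2 in DEFENDER)
            a_best = all(payoff[d][a][1] >= payoff[d][a2][1] for a2 in ATTACKER)
            if d_best and a_best:
                eq.append((d, a))
    return eq

def defender_mix():
    """Defender's equalizing mixed strategy for this 2x2 zero-sum game:
    choose p = P(protect_plant) so the attacker is indifferent between
    targets (a linear indifference condition)."""
    a11 = payoff["protect_plant"]["attack_plant"][1]
    a12 = payoff["protect_plant"]["attack_network"][1]
    a21 = payoff["protect_network"]["attack_plant"][1]
    a22 = payoff["protect_network"]["attack_network"][1]
    # Solve p*a11 + (1-p)*a21 == p*a12 + (1-p)*a22 for p.
    return (a22 - a21) / (a11 - a21 - a12 + a22)

print(pure_nash())      # [] : no pure-strategy equilibrium here
print(defender_mix())   # 0.5
```

This particular matrix has no pure-strategy equilibrium, which is why the sketch also computes the defender's equalizing mixed strategy: against a strategic attacker, randomized defense can be the rational choice, a feature that distinguishes security from classical safety analysis.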