Voice assistants such as Alexa and Google Home are taking over households, in the USA at least, where one in four households owns at least one device. These assistants, which promise to make your life so much easier, seem harmless, but are they really? TU Delft researcher Olya Kudina is not so sure: ‘They do impact our lives, and not only in a good way.’

Bossing around

Kudina researches how technology shapes and changes the way we live and how it influences the way we interact. She became fascinated by voice assistants and the way they influence human beings while visiting friends in the United States three years ago. ‘My friend complained to me that her three-year-old son was being so rude to other children while playing in the sandbox. She didn’t understand where this behaviour came from; she certainly had not taught him to act like this. Over dinner, however, I observed how her husband gave commands to Alexa, the voice assistant, and became more and more agitated because the assistant failed to understand the command to dim the lights. And the little boy repeated after him…’

This incident sparked Kudina’s interest in how AI-enabled voice assistants co-shape values such as communication, courtesy, respect and privacy through their presence in the home. ‘The parents had not realised it, but the way they talked to Alexa influenced the way the little boy spoke to others: commanding and frustrated. I was very curious to find out whether other people experienced similar or other effects, especially because these voice assistant devices were adopted in the USA faster than cell phones were. That means the effects on how we talk, interact and perceive one another in everyday life can be huge.’

Redefining communication

The way people now boss the world around through voice assistants has implications for the morality of a conversation. ‘It is unlikely to become the standard for communication, though, since people can generally distinguish between contexts and the appropriate ways of communicating in them. However, this top-down way of speaking does seep into everyday life. As interaction with voice assistants becomes more common, it may become harder to tell at what point it penetrates our everyday language. It will be even more difficult to tell apart what counts as appropriate communication, since the very meaning of “appropriate communication” may evolve together with these technologies.’

A stupid smart speaker

It is quite astonishing that these home assistant devices are so popular, since their performance is often poor. Manufacturers promise a natural interaction that will make your life easier, answering any questions you might have and even managing the physical space of your home. Do you want to know what the weather will be like? There is no need to grab your cell phone and look it up; just ask the home assistant and it will reply instantly. In reality, a question often needs to be repeated several times, and it has to be very specific and direct to be understood. The smart speaker is not as smart as it is promoted to be. ‘Something that struck me was that people did not blame the machine for not working properly; they started to think they had done something wrong. They had not spoken clearly enough, not loudly enough, and so on. In their view it is not the technology that is faulty. They want to keep the technology at all costs, even bond with it. They still feel that the ease outweighs the poor performance.’

Gender stereotype

It is also interesting that voice assistants predominantly have a female voice, because female voices fit better into the business marketing scheme. Kudina: ‘In general, people find it more pleasant to listen to a female voice; it also sounds less harsh, and the person feels more inclined to keep talking.’ That is exactly how the manufacturers like it, since ultimately these voice assistants are owned by commercial companies that want to sell products: Alexa is owned by Amazon, Google Home by Google. The more you talk to the speaker, the more they know about your interests and preferences. And employees of those companies can listen to anything you have ever said to the speaker; so much for privacy. ‘Another downside is that the voice assistant never says no and is always compliant. What does that teach children about the role of women? Since voice assistant technology is still very new, no research has been done on this topic yet,’ explains Kudina.

Not all bad

Does this mean that voice assistants only negatively affect human behaviour and human interactions? ‘No,’ says Kudina. ‘They are known to have positive effects too. They can help children on the autism spectrum interact better in real life, since they can endlessly “practise” with the speaker. Voice assistants help combat loneliness, especially among the elderly who live alone, since the smart speaker gives them “someone” to talk to. They are also a great help for people who are visually impaired or have dyslexia. But there is still plenty of room for improvement in the ethical design of these devices, especially since they influence our interactions in everyday life and shape our moral landscape.’ The ethical dimension of technologies may be less obvious, but that does not make it less significant.

Values in technology

‘What really strikes me about this technology is that there is seemingly no “in between”. Either you accept that the voice assistant intrudes on your life and your privacy, or you don’t, but then you cannot use it at all. I feel that technology should be designed in such a way that ethical dimensions are incorporated, so that you can use it under specific conditions that fit your values.’ Fortunately, other companies are developing technological innovations that do incorporate ethical design in their products. ‘Project Alias, for instance, made “mushroom-like heads” that you can put on top of voice assistant speakers to help prevent eavesdropping. Recently Q, a voice assistant with an androgynous voice, came on the market.’

Meaningful interactions

Ideally, a voice assistant would be designed that allows for meaningful interactions. ‘A first step for AI-enabled voice assistant technology would be to process human speech better: beyond commands and with more nuance, such as handling sarcasm or jokes. They should be designed with values in mind and with an eye to ethical implications. In practice, that might mean saying “thank you” to Alexa even though Alexa does not care: it is the user herself toward whom this intentional politeness is directed. My ultimate goal would be to test how this kind of interactive speech would play out in the real world,’ concludes Kudina.