When tinkering with our brain, we'd be better off having a philosopher think along with us

It already helps people dealing with severe depression or Parkinson's tremors. Stephen Rainey is a philosopher involved in the development of neurotechnology. Consortia of scientists and psychologists adjust their plans based on his inquiries. Unfortunately, such early involvement is not yet standard practice. Five examples illustrate how it should (and should not) be done. 'In Delft, things are going very well.'

By Rianne Lindhout  •  February 20, 2024

 


1. Deep Brain Stimulation for anxiety, obsessive-compulsive disorder or depression

An electrode deep in the brain can provide significant relief for people with severe anxiety, obsessive-compulsive disorder or symptoms of depression. Deep Brain Stimulation (DBS) is a last resort in cases where cognitive therapy and medication fail to make these disorders manageable. In the Netherlands, it is now covered by health insurance. Individuals with Parkinson's tremors or epilepsy sometimes also qualify for DBS.

'You turn on the current and the trembling stops', says Stephen Rainey, Senior Research Fellow in the Department of Ethics and Philosophy of Technology. 'Or individuals with severe depression can function again. But it's not known exactly how this works and it doesn't work for everyone. This raises an ethical question. It seems like a quick fix but comes with significant risks. It's highly invasive surgery. A hole is drilled in your skull and an electrode is inserted deep inside your brain. There is a risk of infections and other problems.' The electrode becomes less effective over the longer term. 'The brain rejects the electrode or counters its current with electrical activity of its own.'

Stephen Rainey © TU Delft

Rainey sees another problem. 'Is anxiety, for example, a disorder, a deficiency or just a difference? Such an electrode is calibrated to make you "normal", but what is normal? There's a value judgment involved in that. And is the cause of a disorder something within the person, or is their job too stressful, or is there some other source of unrest?' Technology tends to become increasingly accessible. 'It's currently still an invasive operation, so you think "it won't come to that". But everyone knows that technology always advances. Devices are already being developed that can achieve the same effect from outside the skull with a magnetic field. This technology might soon become very common. Do we want that?'

 

2. Regaining the power of speech via an electrode

Rainey was closely involved in a European-funded project aiming to develop technology that decodes speech from the brain activity of people who can no longer speak due to paralysis. The speech prosthesis recognizes speech-related signals in the patient's brain and then recreates the sounds that would have been spoken were it not for the paralysis. 'A fellow philosopher and I were present in every conversation about the technology's development. It was our job to raise extra questions about the concepts the materials scientists, neuroscientists, computer scientists and psychologists were using.'

The philosophical input changed the project's outcome. Rainey: 'A computer scientist always wants as much data as possible to improve the application. But we started thinking about what could happen in the future if we recorded all possible data from someone's brain. The application might end up being used like a lie detector that's not entirely accurate. Shouldn't we limit the data recording so that the device can only do what it really needs to do, to prevent potential misuse in the future?' The project group then chose not to attempt direct thought reading but only to pick up signals from the motor nerves that control the lips, tongue and throat. 'We also limited the language model that translates those signals into words. If it did too much, it could, like ChatGPT, sometimes make things up "on behalf of" the patient.'

 

3. Cochlear implant and smart glasses

The cochlear implant was perhaps the first surgical neurotechnological treatment: it allows deaf people to hear by converting sound into electrical stimuli delivered directly to the auditory nerve. In 2022, Stevie Wonder was enthusiastic about smart glasses developed at TU Delft. On an ethical-philosophical level this seems much less complicated, but appearances can be deceiving, as Rainey illustrates.

'There is a whole discussion around disability. A cochlear implant suggests that being deaf is a flaw - that it has no value. This is questionable because, for example, that attitude denies the value of the deaf community.' Some people are very happy with the implant, but there are also deaf and hard-of-hearing individuals who consciously choose not to have one. Sometimes because of the potentially very unnatural sound they would hear, but also because they cherish their deafness or reduced hearing.

The existence of the cochlear implant means that the limitations of hearing impairment are now, to some extent, a hearing-impaired person's own choice. Rainey: 'This can lead to limitations becoming very emotionally charged. Deaf parents who don't want their child to have an implant can end up in unpleasant discussions with other people. But the question of what is best for the child is not easily answered.'


4. Playing around: controlling a drone with your brain

A headset that records your brain activity can supposedly help you focus better, for example by suggesting that you take a break. There are also drones on the market that you can control with your brain through a headset. Rainey: 'In reality, it may be the electrical activity in your facial muscles that controls the drone, not "thoughts". Yet brain activity is being recorded. With various gadgets like these, the data that is recorded ends up with the manufacturer.' This is an example of the Internet of Things, where a manufacturer – or a hacker – can know when you are at home, for instance because you turned on your heating via the app.

It's fun, but as a consumer you don't know where all that data ends up and what patterns algorithms can find in it. 'For manufacturers, that data is the fuel for training algorithms that can suddenly accelerate the development of this kind of technology in a possibly undesirable direction.' A technology can be very cumbersome or invasive today, like the speech application or deep brain stimulation. But Elon Musk, with his company Neuralink, has set his sights on this kind of technology, and who knows, it might take off because of that. Neuralink recently inserted its first brain implant in a human. 'We need to think ahead, and not just on the basis of what we currently know.'

Rainey shows his students that to think clearly about this technology, it matters how you talk about it. 'Elon Musk talks about merging our brain with the internet. That sounds like magic and you're inclined to think: wow. You become enthusiastic about the exciting things you can do with it. But if you point out that the technology simply records the brain and enables you to make predictions about the brain with AI, you start to look at it differently. Questions arise: what could this be used for, and do we need to protect ourselves against something?'

 

5. Framed by a lie detector

In a murder case in India, a lie detector was used that signalled whether a suspect showed recognition when shown something or someone - the crime scene, for example. This is very flimsy evidence: someone might recognize part of an image but not the whole picture, or might recognize the crime scene without being the perpetrator. This example shows how neurotechnology seems to be crossing a line beyond which people can be disadvantaged by what is recorded, predicted or characterized about them. 'This is a real risk that human rights must respond to, and we must discuss it as a society. In Chile, the constitution has been amended, but we need to organize this internationally. We must have the right to say no to such things - to be protected against the possibility of a large technology company unscrupulously using such technology.'

 

Delft researchers feel a sense of responsibility

Stephen Rainey was appointed as Senior Research Fellow in Delft at the end of 2022. He also has three appointments in England and occasionally works for the European Commission. His direct involvement in Delft's neurotechnology projects is still in the early stages. 'However, TU Delft recently joined EBRAINS, which emerged from a project I worked on around 2015. At that time, we discussed how to share brain data ethically.' EBRAINS is a platform for public neuroscientific data, computer models and software for researchers, clinicians and students.

The neurotechnologists in Delft are hard at work, Rainey observes. 'They are working on project proposals and there is a lot of funding. I now meet many researchers and am involved in devising their research designs.' This is already a big improvement, he explains: 'Around 2010, research consortia approached ethicists about a week before the deadline, but now we are often involved in the research proposal from the beginning. This is certainly not the case everywhere, but it's going well in Delft.' According to Rainey, this is also partly due to the Innovation and Impact Center, which is very proactive in helping researchers with grant applications. 'If necessary, they put someone working on a neurotechnology proposal in contact with someone like me. I have noticed that the idea of systems thinking and social responsibility is strongly present in Delft.'