The writer is founder of Sifted, a media site for European start-ups

A few weeks ago, Chile became the first country to enshrine neuro-rights in law.
To date, most uses of neurotechnology have been benign (so far as we know) and are largely covered by medical regulations. The very idea of implanting electrodes into the brain may sound scary. But such brain-computer interface devices have helped temper the worst effects of Parkinson’s disease and improve the hearing of many hundreds of thousands of people. Both the US and the EU are funding big initiatives in this area to stimulate further medical research and innovation.
Yet it is not only addicts of dystopian science fiction who can imagine how such technology might serve more sinister ends. This is what the Chilean legislation aims to prevent. Guido Girardi, the senator who sponsored the neuro-rights agenda, has argued that brain-scanning devices might soon be able to read, and alter, people’s emotions and minds, affecting their “freedom, thought and free will”. This has become a question of fundamental human rights, which need to be protected, he believes.
It is worth taking such concerns seriously given the speed at which the technology is advancing and the amount of money pouring into the sector. More than $33.2bn has been invested in some 1,200 neurotech companies over the past decade or so, according to NeuroTech Analytics. Although much of this money has gone into medical companies, some of it has funded manufacturers of non-invasive brain-scanning devices, including wearable helmets and glasses, which are designed for more commercial uses and are almost entirely unregulated.
The opening kick of the 2014 World Cup in Brazil was made by Juliano Pinto, a paraplegic man using a mind-controlled robotic exoskeleton. Since then, researchers have used electrodes implanted in the brains of mice to generate false memories and manipulate their actions. “What can be done with mice today could be done with humans tomorrow,” Rafael Yuste, one of the world’s leading neuroscientists, has written.
Prof Yuste chairs the New York-based NeuroRights Foundation, which has been advising Chilean legislators and is pressing the UN to adopt a universal definition of neuro-rights. The foundation argues that these rights should rest on five main principles: non-interference with mental privacy, personal identity and free will; fair access to mental augmentation; and protection from bias. It is also drawing up a technocratic oath, providing an ethical framework for entrepreneurs, researchers and investors developing neuro-technologies. The OECD has already published nine principles for responsible innovation in the field.
Some lawyers have questioned whether it makes sense to adopt specific legislation to defend neuro-rights, as Chile is doing. In many countries, they argue, existing law already safeguards the privacy and integrity of the whole body, which would render separate neuro-rights redundant.
Besides, law only reflects “minimal ethics” when higher standards are required, says Françoise Baylis, research professor at Dalhousie University, who has written on the subject. By defining personal identity too narrowly and focusing so heavily on human rights, she fears that we may absolve those who develop such technology from their broader responsibilities. “The better approach would be to adopt norms rather than laws,” she says. The profit motive cannot be the primary engine for the technology’s use.
As in so many other areas of fast-developing technology, ethics must be wired into a device’s initial design rather than bolted on as an afterthought. Little can be done about authoritarian governments, which will doubtless use neurotechnology for military, surveillance and interrogation purposes. But wherever possible, the industry should take responsibility for how its products are used and prevent them from being abused.
Like the social media companies that are now facing a public reckoning after failing to anticipate how their platforms could inflame toxic debate, the neurotechnology industry will suffer a fearsome techlash if it fails in its duty.