As the field of neurotechnology moves from purely medical applications to new uses in consumer electronics, information processing, and human performance, it has attracted interest and investment from some of the giants of the tech industry. One year ago, internet entrepreneur Bryan Johnson told attendees at the Neurotech Leaders Forum of his $100 million investment in Kernel and his interest in expanding human intelligence using neurotechnology. Since that time, Tesla founder Elon Musk has announced the formation of Neuralink to develop high-bandwidth brain–machine interfaces connecting humans and computers.
At the AdvaMed MedTech conference in San Jose, CA, earlier this month, Regina Dugan, Facebook's vice president of Building 8 projects (and a former DARPA director), outlined the company's initial goal: developing a noninvasive BCI capable of transferring text directly from brain to computer at a rate of 100 words per minute. Dugan made a point of saying that Facebook was not interested in collecting users' thoughts or other private information, but given the company's recent scandal involving the Russian government's exploitation of its platform in the 2016 presidential election, perhaps a more substantive statement is in order.
Other observers have raised alarms. In an essay in Scientific American, Christopher Markou of Cambridge University expressed reservations about Neuralink's proposed whole-brain interface. "I cannot imagine a scenario in which there would not be an endless number of governments, advertisers, insurers, and marketing folks looking to tap into the very biological core of our cognition to use it as a means of thwarting evildoers and selling you stuff," he wrote.
Writing in the journal Life Sciences, Society, and Policy, Marcello Ienca, a neuroethicist at the University of Basel, and Roberto Andorno, a human rights lawyer at the University of Zurich, proposed four new human rights designed for neurotechnology applications: the rights to cognitive liberty, mental privacy, mental integrity, and psychological continuity. These rights would allow users to decide which neurotech tools they would or would not use, protect the contents of their mental states, and protect them from brain hacking.
We'd like to think that these rights are inherent in our constitution and laws. After all, if the First Amendment protects freedom of speech, then surely a zeroth amendment should protect freedom of thought. But voters long ago ceded that ground when they elected politicians who promised to outlaw certain mental states achieved with certain recreational drugs.
In the end, it's up to all of us to remain vigilant about the potential threats of neurotechnology even as we welcome its benefits.
Editor and Publisher