Brain-reading tech is coming. The law is not ready to protect us.

In the era of neurocapitalism, your brain needs new rights.

“Nothing was your own except the few cubic centimeters inside your skull.” That’s from George Orwell’s dystopian novel 1984, published in 1949. The comment is meant to highlight what a repressive surveillance state the characters live in, but looked at another way, it shows how lucky they are: At least their brains are still private.

Over the past few months, Facebook and Elon Musk’s Neuralink have announced that they’re building tech to read your mind — literally.

Mark Zuckerberg’s company is funding research on brain-computer interfaces (BCIs) that can pick up thoughts directly from your neurons and translate them into words. The researchers say they’ve already built an algorithm that can decode words from brain activity in real time.
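To make that concrete, here is a minimal sketch of how a decoder of this general kind can work: recorded neural activity is reduced to feature vectors, and a trained classifier maps each incoming vector to the most likely word. Everything below — the simulated signals, the four-word vocabulary, the logistic-regression model — is invented for illustration; it is not Facebook’s or its researchers’ actual pipeline.

```python
# Toy illustration of a "brain-to-text" decoder. All data here is
# simulated; a real system would use recorded cortical activity.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
vocab = ["yes", "no", "water", "help"]   # hypothetical four-word vocabulary
n_features = 64                          # stand-in for per-window neural features

# Simulate training data: each word gets a characteristic activity pattern.
prototypes = rng.normal(size=(len(vocab), n_features))
X = np.vstack([p + 0.5 * rng.normal(size=(200, n_features)) for p in prototypes])
y = np.repeat(np.arange(len(vocab)), 200)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# "Real-time" decoding: classify each new window of activity as it arrives.
incoming = prototypes[2] + 0.5 * rng.normal(size=n_features)
print(vocab[decoder.predict(incoming.reshape(1, -1))[0]])  # most likely "water"
```

The real systems are vastly more sophisticated, but the basic shape is the same: features in, predicted words out.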

And Musk’s company has created flexible “threads” that can be implanted into a brain and could one day allow you to control your smartphone or computer with just your thoughts. Musk wants to start testing in humans by the end of next year.

Other companies, such as Kernel, Emotiv, and NeuroSky, are also working on brain tech. They say they’re building it for ethical purposes, like helping people with paralysis control their devices.

This might sound like science fiction, but it’s already begun to change people’s lives. Over the past dozen years, a number of paralyzed patients have received brain implants that allow them to move a computer cursor or control robotic arms. Implants that can read thoughts are still years away from commercial availability, but research in the field is moving faster than most people realize.

Your brain, the final privacy frontier, may not be private much longer.

Some neuroethicists argue that the potential for misuse of these technologies is so great that we need revamped human rights laws — a new “jurisprudence of the mind” — to protect us. The technologies have the potential to interfere with rights that are so basic that we may not even think of them as rights, like our ability to determine where our selves end and machines begin. Our current laws are not equipped to address this.

4 new rights we may need enshrined in law

Several countries are already pondering how to handle “neurorights.” In Chile, thanks in part to the advocacy of neuroscientist Rafael Yuste, the government agreed in October to provide official backing for a NeuroProtection agenda that would make brain data protection a human right. And the OECD adopted in December a set of nine new principles for regulating the use of brain data, the first international standard in this field.

One of the main people pushing for these new human rights is neuroethicist Marcello Ienca, a researcher at ETH Zurich, one of Europe’s top science and technology universities. In 2017, he released a paper outlining four specific rights for the neurotechnology age that he believes we should enshrine in law. I reached out to ask what he thought of the recent revelations from Facebook and Neuralink.

“I’m very concerned about the commercialization of brain data in the consumer market,” he said. “And I’m not talking about a farfetched future. We already have consumer neurotech, with people trading their brain data for services from private companies.” He mentioned neurogaming, where you control your movements in a video game using your brain activity rather than a traditional controller, and self-tracking, where you use wearable devices to, say, monitor your sleep. “I’m tempted to call it neurocapitalism.”

BCI tech includes systems that “read” neural activity to decode what it’s already saying, often with the help of AI software, and systems that “write” to the brain, giving it new inputs to actually change how it’s functioning. Some systems do both.
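As a rough sketch of that read/write distinction, consider a hypothetical closed loop: the “read” side condenses a window of recorded activity into an estimate of a mental state, and the “write” side adjusts stimulation in response. The signal, the variance-based “alertness” score, and the threshold below are all stand-ins chosen for illustration, not any real device’s behavior.

```python
import numpy as np

def read_state(window: np.ndarray) -> float:
    """The 'read' side: condense a window of recorded activity into a
    scalar stand-in for alertness (here, simply the signal variance)."""
    return float(np.var(window))

def write_stimulation(level: float) -> None:
    """The 'write' side: placeholder for delivering neuromodulation.
    A real implant would drive electrodes; this sketch only prints."""
    print(f"stimulating at level {level:.2f}")

# A system that does both: read the state, then write in response.
rng = np.random.default_rng(1)
for _ in range(3):
    window = rng.normal(scale=0.8, size=256)  # simulated recording window
    alertness = read_state(window)
    if alertness < 0.7:                       # hypothetical threshold
        write_stimulation(0.7 - alertness)
```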

When I asked Ienca to explain each of the four human rights he believes we need and to give a concrete example of how neurotechnology might violate it, he came up with some frightening scenarios, some of them already underway. Let’s break them down.

1. The right to cognitive liberty

You should have the right to freely decide you want to use a given neurotechnology or to refuse it.

In China, the government is already mining data from some employees’ brains by having them wear caps that scan their brainwaves for depression, anxiety, rage, or fatigue. “If your employer wants you to wear an EEG headset to monitor your attention levels, that might qualify as a violation of the cognitive liberty principle,” Ienca said, because even if you’re told that wearing the device is optional, you’ll probably feel implicit pressure to do so since you don’t want to be at a competitive disadvantage.
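For a sense of how such monitoring can work: consumer EEG systems commonly compare power across frequency bands, and one well-known attention proxy is the engagement index, beta power divided by the sum of alpha and theta power. The sketch below computes it on a simulated signal; the band boundaries are conventional in the literature, but the signal, sampling rate, and any cutoff an employer might apply are illustrative assumptions.

```python
import numpy as np

FS = 256  # sampling rate in Hz, typical for consumer EEG (assumed)

def band_power(signal, fs, lo, hi):
    """Total power in the [lo, hi) Hz band, via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].sum()

def engagement_index(signal, fs=FS):
    """Classic attention proxy: beta / (alpha + theta) band power."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

# Simulated 2-second window dominated by 10 Hz alpha (relaxed, unfocused).
t = np.arange(2 * FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.default_rng(2).normal(size=t.size)
print(f"engagement index: {engagement_index(eeg):.3f}")  # low for this signal
```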

He added that the US military is also looking into neurotechnologies to make soldiers more fit for duty. Down the line, that could include ways to make them less empathetic and more belligerent. Soldiers may be pressured to accept interventions.

“There is already military-funded research to see if we can monitor decreases in attention levels and concentration, with hybrid BCIs that can ‘read’ deficits in attention levels and ‘write’ to the brain to increase alertness through neuromodulation. There are DARPA-funded projects that attempt to do so,” Ienca said, referring to the Defense Department’s advanced research agency.

2. The right to mental privacy

You should have the right to seclude your brain data or to publicly share it.

Ienca emphasized that neurotechnology has huge implications for law enforcement and government surveillance. “If brain-reading devices have the ability to read the content of thoughts,” he said, “in the years to come governments will be interested in using this tech for interrogations and investigations.”

The right to remain silent and the principle against self-incrimination — enshrined in the US Constitution — could become meaningless in a world where the authorities are empowered to eavesdrop on your mental state without your consent.

It’s a scenario reminiscent of the sci-fi movie Minority Report, in which a special police unit called the PreCrime Division identifies and arrests murderers before they commit their crimes.

3. The right to mental integrity

You should have the right not to be harmed physically or psychologically by neurotechnology.

BCIs equipped with a “write” function can enable new forms of brainwashing, theoretically enabling all sorts of people to exert control over our minds: religious authorities who want to indoctrinate people, political regimes that want to quash dissent, terrorist groups seeking new recruits.

What’s more, devices like those being built by Facebook and Neuralink may be vulnerable to hacking. What happens if you’re using one of them and a malicious actor intercepts the Bluetooth signal, increasing or decreasing the voltage of the current that goes to your brain — thus making you more depressed, say, or more compliant?

Neuroethicists refer to that as brainjacking. “This is still hypothetical, but the possibility has been demonstrated in proof-of-concept studies,” Ienca said, adding, “A hack like this wouldn’t require that much technological sophistication.”

4. The right to psychological continuity

You should have the right to be protected from alterations to your sense of self that you did not authorize.

In one study, an epileptic woman who’d been given a BCI came to feel such a radical symbiosis with it that, she said, “It became me.” Then the company that implanted the device in her brain went bankrupt and she was forced to have it removed. She cried, saying, “I lost myself.”

Ienca said that’s an example of how psychological continuity can be disrupted not only by the imposition of a neurotechnology but also by its removal. “This is a scenario in which a company is basically owning our sense of self,” he said.

Another threat to psychological continuity comes from the nascent field of neuromarketing, where advertisers try to figure out how the brain makes purchasing decisions and how to nudge those decisions along. The nudges operate below the level of conscious awareness, so these noninvasive neural interventions can happen without us even knowing it. One day a neuromarketing company is testing a subliminal technique; the next, you might find yourself preferring product A over product B without quite being sure why.

Consumer advocate organizations have raised the alarm about neuromarketing. Jeff Chester, the executive director of the Center for Digital Democracy, has said that advertising aimed at adults, which has traditionally gone unregulated on the assumption that adults have rational defenses that let them discern what’s true and untrue, should be regulated “if the advertising is now purposely designed to bypass those rational defenses.”

“Brain data is the ultimate refuge of privacy”

Given the worries about neurocapitalism, I asked Ienca whether neurotechnologies should be taken out of the control of private companies and reclassified as public goods. He said yes — both to prevent companies from inflicting harm and to prevent them from affording benefits only to rich people who can pay for their products.

“One risk is that these technologies could become accessible only to certain economic strata and that’ll exacerbate preexisting social inequalities,” he said. “I think the state should play an active role in ensuring these technologies reach the right people.”

It’s hard to say whether Ienca’s neurorights, the OECD’s principles, or Chile’s agenda will effectively keep neurotechnology’s risks in check. But given how fast this tech is developing, it does seem likely that we’ll need new laws to protect us, and now is the time for experts to articulate our rights. Lawmakers move slowly, and if we wait for devices like Facebook’s or Neuralink’s to hit the market, it could already be too late.

“Brain data is the ultimate refuge of privacy. When that goes, everything goes,” Ienca warned. “And once brain data is collected on a large scale, it’s going to be very hard to reverse the process.”

By Sigal Samuel
