Architecture and Hardware News

Neurotechnology and the Law

How closely should implants be regulated?
[Illustration: a computer chip embedded into a human brain]

As brain implants become more commonplace and may eventually be used for non-medical purposes, some experts believe they must be regulated.

Regulations should be considered “a natural next step,” says Rajesh P. N. Rao, a professor at the University of Washington in Seattle whose background spans computer science, engineering, and computational neuroscience. Rao earned his Ph.D. in artificial intelligence (AI) and computer vision, then trained in neuroscience as a postdoctoral fellow.

Eventually, there will be two-way communication between doctors and the devices, with AI as an intermediary, Rao says. “In the future, that kind of device embedded with AI can look at what’s happening in other parts of the brain to treat depression or epilepsy and stopping seizures and bridging an injured area of the brain or shaping the brain to be less depressed.”

Efforts are under way to further the use of these devices. For example, BrainGate is a U.S.-based multi-institutional research effort to develop and test novel neurotechnology aimed at restoring communication, mobility, and independence. It is geared toward people who retain cognitive function, but have lost bodily connection due to paralysis, limb loss, or neurodegenerative disease. BrainGate’s partner institutions include Brown, Emory, and Stanford universities, as well as the University of California at Davis, Massachusetts General Hospital, and the U.S. Department of Veterans Affairs.

Tesla CEO Elon Musk is working on a robotically implanted brain-computer interface (BCI) system through his company Neuralink, which aims to allow the brain to communicate with a computer. Neuralink is designing what it claims is the first neural implant that would let a user control a computer or mobile device. The approach is to insert micron-scale threads that contain electrodes into the areas of the brain that control movement. Each thread is connected to Neuralink’s implant, the Link.

Rao says he is not aware of any brain implants currently being used for augmentative purposes in humans to facilitate better athletic performance or for enhanced gaming skills, but the potential exists. Achieving such improvements will necessitate “much more nuanced regulations,” because once that happens, “one has to think about what this device is doing, since it is being used for enhancing the capabilities of people.”

Non-invasive devices already are being used to deliver electricity to the brain to improve sports performance, Rao says.

“It’s still very early; in the Kitty Hawk days … but it’s a good time for us to think about these devices before technology leaps ahead and it’s too late,” he says. Regulating neurotechnology must be international, he adds.

Some brain implants already have received U.S. Food and Drug Administration (FDA) approval. They include Medtronic’s Percept deep brain stimulation device, and the RNS system from NeuroPace. “So we’re starting to see these devices come into the market, and legislation would have to look into the side effects: what are the implications of having these enhancements to humans,” Rao says.


High Expense, High Risk

Others question the possibility of implanting devices in the brain for anything other than treating medical conditions.

“If [a device is implanted], it has to be implanted by surgeons, so I think anything that is implanted is going to have to be approved as a biomedical device,” says Jennifer Chandler, a law professor at the University of Ottawa who focuses on the legal and ethical aspects of biomedical science and technology, including brain sciences.

“It’s hard to imagine someone getting something implanted to play a game,” she says. “It seems high-risk and incredibly expensive.”

Chandler thinks neurotechnology is a field that should be closely regulated “because the risks are profound” when implanting a device in the brain. Deep brain stimulation involves drilling small holes in the skull to implant electrodes and devices. Side effects can include bleeding in the brain, stroke, infection, and breathing problems, according to the Mayo Clinic.

Hank Greely, a Stanford University law professor and director of the Stanford Center for Law and the Biosciences, says he is not aware of any U.S. regulation specific to brain implants. Most, if not all, are medical devices, and thus subject to some regulation in most countries, at least when used for medical purposes, he says.

“I don’t know of anyone who has a brain implant for non-medical reasons, though the world has enough crazy people that someone may have tried it,” Greely says.

Right now, brain implants are not being used widely enough, or far enough outside appropriate medical or research settings, to warrant legislation, regulation, or much concern yet, Greely says. That could change if the implantation of devices into the brain becomes much easier. “Elon Musk’s dreams are so far from reality that they need not be, at this point at least, anyone else’s nightmares,” he says.


Self-Regulation Over Legislation

Judy Illes, a professor of neurology and Distinguished University Scholar at Canada’s University of British Columbia, does not believe new laws are needed to regulate brain implants. Illes’ research focuses on ethical, legal, social, and policy challenges at the intersection of the brain sciences and biomedical ethics. She advocates for “professional self-regulation and introspection guided by individuals and our professional societies and organizations, over legislation.”

Laws have their place, Illes says. “What I’d like to see in the space of neuroscience and neuroethics and devices is that we work as a group to harmonize the use of implants globally, before we have to turn to legislation to regulate” neurotechnology.

Another issue Illes sees is that legislation tends not to be nimble: laws take time to change, while the brain sciences are constantly evolving. “So why [create laws] when we can really work on the ground and create something that is immediately relevant and from the inside and respond to [the] pace of discovery?”


Dr. Patrick J. McDonald, head of neurosurgery and an associate professor at the University of Manitoba in Winnipeg, Canada, implants devices in the brain, mostly to treat epilepsy and hydrocephalus (a build-up of fluid in the cavities of the brain). He is not aware of any device-specific legislation regarding brain implants in Canada, although he says some jurisdictions have legislation relating to neurosurgery for the treatment of psychiatric illness, for which deep brain stimulation is occasionally used.

Regulation of implantable devices in Canada is done under the auspices of Health Canada, that country’s equivalent of the U.S. FDA, and is not governed by specific legislation, McDonald says.

A 2019 paper McDonald co-authored with Illes and others maintains that “regulatory oversight places patients, healthcare finances, and societal trust in biomedicine at risk.” The paper, “Regulatory oversights for implantable neurodevices,” also expresses concerns about a report from the International Consortium of Investigative Journalists. The authors said the report “underscored the limitations of the regulatory environment for medical devices, including an inadequate approval process, the scarcity of device registries, and a recall process that relies primarily on the device manufacturers to notify regulatory agencies of adverse events.”

That same report led to promises by the FDA and Canada’s Ministry of Health to overhaul their regulatory systems after deficiencies were discovered surrounding approvals, monitoring, and recall mechanisms, according to the McDonald/Illes paper.

However, the authors don’t feel those changes went far enough. “We are concerned about the absence of rigor in the FDA approval process for medical devices,” the authors wrote. Instead, they propose a mandatory registry for all implanted neurological devices, and that any “adverse events and recalls should be required and made readily available on a publicly accessible, searchable website to ensure adequate post-market surveillance.” Physician reporting of adverse events should also be mandatory, they maintain.


Exploitation, Potential Hacks, and Ethical Concerns

Rao believes that as brain implants become more widespread, more research will be needed, from a medical perspective, on the potential long-term side effects of having these devices implanted in the human body, particularly in the brain.

“Think of it as another kind of drug. These kinds of drugs may open up certain experiences that could lead to addiction because parts of the brain could be stimulated,” just as they are by tobacco or alcohol, he says.

Rao co-authored a paper in early 2022 on the ethical and social implications of brain-computer interfaces that use artificial intelligence, known as brain co-processors. In it, he writes that like most new technologies, BCIs face significant risk of exploitation “by criminals, terrorists, commercial enterprises, or spy agencies, as well as legal, law enforcement, and military entities. Brain stimulation opens up the dangerous possibility that an unsecure device may be hijacked and used to coerce a person to perform objectionable acts, such as commit a crime or sign a legal document.”

There is potential for unprecedented abuse and malicious attacks, such as a hacker sending a virus to a device, which could result in cognitive impairment or manipulation, Rao maintains. Another risk is that “brain spyware” could add or replace legitimate components of a BCI system to extract a user’s cognitive and behavioral processes without their permission.

“It is imperative that strong legal and technological safeguards are put in place before widespread deployment of any coprocessor, for instance by considering neurosecurity during the design process itself,” according to Rao. “Activities that violate a co-processor’s security and privacy should be made illegal, with stringent punishments for breaking the law.”
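To make Rao’s point about designing in neurosecurity concrete, the sketch below illustrates one standard technological safeguard of the kind he describes: authenticating each command sent to an implanted device with a keyed MAC and a monotonic counter, so forged or replayed stimulation commands are rejected. This is a generic cryptographic pattern, not any vendor’s actual protocol; all names and the command format are hypothetical.

```python
import hmac
import hashlib
import os

# Hypothetical shared secret, provisioned securely when the device is implanted.
DEVICE_KEY = os.urandom(32)

def sign_command(command: bytes, counter: int, key: bytes = DEVICE_KEY) -> bytes:
    """Prefix a monotonic counter and append an HMAC-SHA256 tag,
    so the implant can detect forged or replayed commands."""
    payload = counter.to_bytes(8, "big") + command
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def verify_command(message: bytes, last_counter: int, key: bytes = DEVICE_KEY):
    """Return (command, counter) if the message is authentic and fresh;
    raise ValueError otherwise."""
    payload, tag = message[:-32], message[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("rejected: bad authentication tag")
    counter = int.from_bytes(payload[:8], "big")
    if counter <= last_counter:
        raise ValueError("rejected: replayed command")
    return payload[8:], counter
```

In this design, a hijacker who cannot read the shared key cannot produce a valid tag, and capturing a legitimate message and resending it fails the counter check; real implantable systems would need far more (key management, secure firmware, fail-safe defaults), but the example shows what “considering neurosecurity during the design process itself” can mean in practice.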

Rao and other researchers have not started discussions on regulating brain implants with lawmakers. He says the FDA has expressed interest in these ideas, “but is more focused on short-term medical devices,” and the agency is not taking a long-term view on what will happen in 10, 20, or 30 years.

There is much that needs to be addressed. International conversations about the moral aspects of regulating neurotechnology are also needed, he says, because some countries are more lenient in what they allow. A decision must then be made about whether regulating brain implants for non-medical purposes should be left up to individual states or countries.

“If you have people using devices to augment themselves, what happens when a person goes from one country to another?” Rao says. Other open questions include whether people with implants that augment cognitive or physical abilities should be given different academic or standardized tests, or be required to turn off their devices while taking them.

These issues will lead to challenges we have never faced before because the device is interacting with the brain, says Rao.

“In the 1950s or 60s, if you told someone you were having surgery on your nose to change your appearance, people would have said you’re crazy,” Rao says. “The same thing will happen with implants to improve how they feel or think or learn faster. These are all things that could potentially happen.”

Illes disagrees. “One really has to wonder who would choose to have a brain implant for a non-medical condition. … The problem with that conversation is it takes away from patients who are truly suffering,” she says.

“If we start worrying about the five to 10 people who think an implantable device will make them faster or smarter and create laws because we’re spending time worrying about that, we’ll hurt people, especially the ones who are most vulnerable from brain and mental health conditions, because we’re distracted from what matters.”

Further Reading

Rao, R.P.N. and Schonau, A.
Brain Co-Processors: Ethical and Social Implications, 2022.

Rao, R.P.N.
Brain-Computer Interfacing: An Introduction, Cambridge University Press, New York, NY, 2013.

Rao, R.P.N.
Brain Co-Processors: Using AI to Restore and Augment Brain Function, Cornell University, 2020.

Fanelli, F. and Ghezzi, D.
Transient electronics: new opportunities for implantable neurotechnology, Current Opinion in Biotechnology, Volume 72, 2021, pp. 22–28.

Schmid, A., Tokuda, T., and Ker, M.D.
Editorial: Microelectronic Implants for Central and Peripheral Nervous System: Overview of Circuit and System Technology, Frontiers in Neuroscience, November 19, 2021.

Errigo, M.C.
Neuroenhancement and Law, in D’Aloia, A. and Errigo, M. (Eds.), Neuroscience and Law, Springer, Cham.

IEEE Neuroethics Framework: Addressing the Ethical, Legal, Social and Cultural Implications of Neurotechnology.

Dadson, A.
Thinking Out Loud: The Ethical Implications of Neural Implants.

McDonald, P.J., Lau, C., Coates McCall, I., Lipsman, N., and Illes, J.
Regulatory oversights for implantable neurodevices, The Lancet, October 2019.
