Regulating Neural Data

by James Cavuoto, editor

The issue of neurotech privacy has attracted considerable attention from the general public recently. Long-time readers of this publication will recall that we have advocated in this space for protecting users’ neural data from the outset.

The issue took on more immediacy this month when the state of Colorado became the first to pass legislation that specifically protects neural data gathered from consumer neurotech devices. The new measure, HB24-1058, expands the definition of “sensitive data” within the Colorado Privacy Act to encompass “biological data,” including neural data. The bill passed the full House on a vote of 61-1. The Colorado Medical Society provided expert professional validation ahead of the legislative process and strategic resources throughout it.

Also this month, California State Senator Josh Becker’s SB 1223 was discussed in a hearing of the Senate Judiciary Committee and then adopted on an 11-0 vote, putting California on the path to extending the protections of the California Consumer Privacy Act to cover neural data.

These legislative actions, and the widespread bipartisan support they have gained, speak to the growing awareness of neurotechnology among the media and the general public. They also reflect the work of authors such as Nita Farahany of Duke University and of new nonprofit organizations such as the NeuroRights Foundation. That organization recently released a report entitled “Safeguarding Brain Data: Assessing the Privacy Practices of Consumer Neurotechnology Companies.” In the report, the authors examined the privacy policies of 30 companies that offer consumer neurotechnology products. They found that 29 of the 30 appear to have access to consumers’ neural data and place no meaningful limitations on that access. They also pointed out shortcomings in how the companies inform customers about their policies. More than half of the firms have explicit policies that allow them to share neural data with third parties.

Still, it is noteworthy that the organization is not opposed to the sale of consumer neurotech products; it simply wants more care taken with the use of the neural data those products collect. “As neurotechnology devices proliferate beyond medical settings outside the strict requirements for medical devices and health privacy, it is critical that consumers comprehend exactly how companies can use their neural data and what rights they have over that usage,” the authors write. “Without this information, consumers cannot make meaningfully informed choices about their privacy, and they may unwittingly expose their most sensitive data.”

Another new organization in the U.K., The Institute of Neurotechnology and Law, has taken up the issue of neural data privacy. “The intersection of technology and human cognition offers vast opportunities but also unprecedented challenges,” said founder Harry Lambert. “Navigating this terrain will require foresight, flexibility, and firm commitment to ethical principles to ensure that neurotechnology enhances human capabilities without compromising human dignity or autonomy. As this field continues to evolve, robust debate and thoughtful legislation will be crucial in steering the direct trajectory of neurotechnological advancement for the benefit of all humanity.”

While government efforts to protect users’ neural data from consumer neurotech firms are certainly reasonable, we continue to advocate for more protection of our brain signals from the prying eyes of the government itself. As we pointed out in the second issue of this publication 23 years ago, the gentlemen who authored the Fourth Amendment to the U.S. Constitution may not have foreseen the ability of neurotech sensing devices to pierce the veil of human cognition, but they most certainly would have found such an intimate search patently unreasonable.