Where free will, mental privacy inform protection of 'neuro rights'

By Cheng Yu | China Daily

A visitor experiences brain-computer interface technology from Tianjin University at the seventh World Intelligence Congress in Tianjin. [Photo provided to chinadaily.com.cn]

Both China and the United States are pushing brain-computer interface technology forward, giving the once sci-fi-like field a growing sense of reality. Not surprisingly, there are concerns about who will protect the extremely sensitive data on consumers' brain functions and responses that the technology will generate.

A potential nightmare scenario, for example, could be one in which someone exploits a security loophole in brain-computer interface products to steal highly personal data or insert harmful malware into the system.

What if BCI users end up with a device that doesn't function or, worse, makes them vulnerable to mind-reading, or even mind-control, by others?

Experts have therefore stressed that security and ethical issues should be dealt with before going full speed ahead with wide-ranging applications and commercialization of BCI technology.

China appears to be aware of the situation and has already taken action to regulate the emerging industry and ensure its healthy development. Earlier this year, it issued guidance on BCI research, covering the conduct of research and risk prevention in both research and applications of the technology.

The guidance, developed by the Artificial Intelligence Ethics Subcommittee of the National Science and Technology Ethics Committee, outlined six basic principles for researchers and institutions.

The first principle emphasized that BCI research should be "moderate and harmless", with the fundamental purpose of "assisting, enhancing and repairing humans' sensory and motor functions, or improving human-computer interaction capabilities, to enhance human health and welfare".

Researchers should conduct ethical and data security reviews of their plans and results, and carry out risk monitoring throughout the process to protect participants' safety, privacy, data security and legal rights.

Hexi Yujin, partner and senior vice-president of BrainCo, a Harvard University-backed BCI startup, said: "In terms of data protection, companies now obtain data with the informed consent of users and all data have been desensitized, which means key personal data have been wiped out.

"Companies are also regulated to only use such data for modeling to satisfy users and cannot use the data for any other purposes."

Domain experts stress that while the frontier technology offers unprecedented benefits to both consumers and businesses, it is equally important to ensure that it is not abused for commercial gain.

Currently, BCI technology can only recognize specific instructions and cannot yet "read minds". Duan Weiwen, a senior researcher at the Chinese Academy of Social Sciences, said: "If, one day, the brain-computer interface develops to the level of deep interaction with the brain, then we must define what can be done and what cannot be done.

"Therefore, it's very necessary to prompt stakeholder groups to conduct extensive discussions on issues such as neural data and mental privacy that may be involved in brain-computer interfaces, and to formulate corresponding technical standards, norms and ethical principles, and establish dedicated supervision, compliance and ethical review mechanisms."

At the end of the day, if BCI technology can really read minds, the world may come to regard "neuro rights" as basic human rights and advocate including free will and mental privacy within the scope of human rights protection.
