Years ago, humans imagined a magical world where it was possible to interact with machines through thought alone. Most people dismissed this world as myth and fiction. Now, with advances in neuroscience and brain-imaging technology, scientists have taken a great interest in turning that fiction into reality.
What is AI & Human Interface?
A brain-computer interface (BCI) is a method of communication based on neural activity generated by the brain, independent of the brain’s normal output pathways of peripheral nerves and muscles. The neural activity used in BCI can be recorded using invasive or non-invasive techniques. The goal of BCI is not to determine a person’s intent by eavesdropping on brain activity, but rather to provide a new output channel for the brain, one that requires voluntary adaptive control by the user. (Anirudh Vallabhaneni)
A brain-computer interface, also known as a mind-machine interface, is a direct communication pathway between the brain and an external device, bypassing the body’s normal motor output. The signal travels directly from the brain to the computer, rather than from the brain through the neuromuscular system to a finger on a mouse.
Types of Human & Artificial Intelligence Interface
The names invasive and non-invasive should (hopefully) be rather self-explanatory to most readers of this blog. Crudely put, an invasive process is any process that requires delivering foreign objects or substances inside the body of the subject whose brain is to be interfaced, be it a human or an animal. These can range from large electrode arrays to chemical molecules delivered by injection (an injected drug does count as an invasive process). Non-invasive procedures, on the other hand, require no implantation of any kind: the subject interfaces with the machine through wearable items – the haute couture of the BCI world. BCIs are directed at augmenting, assisting, or repairing sensory-motor or human cognitive functions. The field combines technologies from electrical engineering, computer science, biomedical engineering, and neurosurgery. (Brain Vision UK Ltd 2014)
The discovery of brain-computer interfaces is closely connected to Hans Berger’s pioneering research into the electrical activity of the human brain. Berger is credited with developing electroencephalography, a major breakthrough that allowed researchers to record the brain’s electrical activity as an electroencephalogram (EEG). (Brain Vision UK Ltd 2014)
Some 40 years later, in the 1970s, researchers were able to develop primitive control systems based on electrical activity recorded from the head. The Pentagon’s Defense Advanced Research Projects Agency (DARPA), the same agency involved in developing the first versions of the Internet, funded research focused on developing bionic devices that would aid soldiers.
Current BCI-based tools can aid users in communication, daily living activities, environmental control, movement, and exercise, with limited success and mostly in research settings. (Anirudh Vallabhaneni)
As artificial intelligence progresses by leaps and bounds, this technology opens new horizons. When we think about the possibilities it could create, the mind is wonderstruck. We can restore independence, and a feeling of being alive, to the impaired and handicapped. We can spare our soldiers the excruciating pain of warfare. Human lives can be saved and eased. Robotics is indeed the future, and it is almost here. Let us discuss a few potential uses of artificial intelligence amalgamated with the human mind.
Artificial Intelligence & the Human Mind in Medical Sciences
Individuals who are severely disabled by disorders such as ALS, cerebral palsy, brainstem stroke, spinal cord injuries, muscular dystrophies, or chronic peripheral neuropathies might benefit from BCIs. To help determine the value of BCIs for different individuals, Wolpaw et al. suggested that potential BCI users be categorized by the extent, rather than the aetiology, of their disability. Evaluated in this way, potential BCI users fall into three reasonably distinct groups:
(1) people who have no detectable remaining useful neuromuscular control and are thus totally locked-in;
(2) people who retain only a very limited capacity for neuromuscular control, such as weak eye movements or a slight muscle twitch; and
(3) people who still retain substantial neuromuscular control and can readily use conventional muscle-based assistive communication technology. (Joseph N. Mak and Jonathan R. Wolpaw)
Artificial intelligence coupled with the human mind has the following medical applications:
- allow paralyzed people to control prosthetic limbs with their mind
- transmit visual images to the mind of a blind person, allowing them to see
- transmit auditory data to the mind of a deaf person, allowing them to hear
- allow gamers to control video games with their minds
- allow a mute person to have their thoughts displayed and spoken by a computer
Artificial Intelligence & Human Mind Coordination for Warfare
Military interest in mind-control is nothing new. The US Defense Advanced Research Projects Agency (DARPA) has been funding a variety of BCI projects since the early 1970s – with other nations running similar programmes too – and the technology has come a long way in those intervening 40 years. (Gareth Evans 2013).
Mind Reading in the Military
The principles behind BCI could hardly be simpler: sensors detect the electrical signals of the user’s brain, and those signals are then rendered into a computer-usable form. Actually achieving such an interface successfully is, however, rather less straightforward.
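The detect-then-decode loop described above can be illustrated with a toy sketch. Everything here is simulated and simplified – no real EEG hardware, device driver, or signal-processing library is assumed: a one-second epoch of samples is reduced to band-power features with a naive DFT, and a trivial rule maps those features to a state label.

```python
import math

def band_power(samples, fs, lo, hi):
    """Naive DFT: summed power of frequency bins between lo and hi Hz."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def classify(samples, fs=256):
    """Toy decoder: compare alpha (8-12 Hz) vs beta (13-30 Hz) band power."""
    alpha = band_power(samples, fs, 8, 12)
    beta = band_power(samples, fs, 13, 30)
    return "rest" if alpha > beta else "active"

# Simulated one-second epoch: a strong 10 Hz (alpha) rhythm plus a weak 20 Hz component.
fs = 256
epoch = [math.sin(2 * math.pi * 10 * t / fs) + 0.2 * math.sin(2 * math.pi * 20 * t / fs)
         for t in range(fs)]
print(classify(epoch, fs))  # strong alpha dominates -> prints "rest"
```

Real systems replace each stage – acquisition, filtering, feature extraction, classification – with far more sophisticated methods, but the pipeline shape is the same.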
Telepresence – allowing a human operator an at-a-distance presence in a remote environment via a brain-actuated robot – is a relatively new field of research; most of the work to date has been done with operator and robot in close proximity to each other. Controlling mobile robot agents is one area where BCI appears to hold much promise, perhaps most notably for its potential to revolutionise reconnaissance in hostile terrain – originally the primary function of aerial drones before they were widely weaponised.
Augmenting human strength and endurance with wearable suits has been an area of military research for decades, and the technology has come a long way since the first rather crude and clunky attempts. One such suit is said to be robust enough to enable its wearer to lift 200 lb loads – or punch through three inches of wood – repeatedly, yet agile and responsive enough to negotiate stairs or ramps with ease.
BCI also has major potential military applications beyond the field of robotics – and improved communications are high on the list. DARPA’s ongoing project ‘Silent Talk’ is another idea that has distinctly sci-fi overtones and aims to allow front-line soldiers to communicate telepathically.
A few other uses of the said technology can be as follows:
- Drunk drivers, a major contributor to road accidents, could also be identified through the use of EEG signals, as mentioned earlier
- Security systems involve knowledge-based, object-based, and/or biometrics-based authentication. These have been shown to be vulnerable to several drawbacks, such as simple insecure passwords, shoulder surfing, theft, and biometric traits that, once stolen, cannot be revoked like a password. Cognitive biometrics, or electrophysiology, where only modalities based on biosignals (such as brain signals) are used as sources of identity information, offers a solution to those vulnerabilities
- The major advantage is that it makes you appear smarter and definitely faster at solving various sorts of mental problems. It might also, perhaps, be useful for controlling complex systems (such as machinery). This would, at the least, make you a better employee candidate in many vocations.
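As a rough illustration of the cognitive-biometrics idea in the list above, here is a minimal enrol-and-verify sketch. The feature vectors and the distance threshold are purely hypothetical stand-ins for real per-channel EEG band-power features; production systems use far richer features and statistical matching, not a plain Euclidean cutoff.

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class EEGAuthenticator:
    """Toy cognitive-biometric check: a user enrols a brain-signal feature
    vector, and later attempts are accepted only if they fall within a
    distance threshold of the stored template."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.templates = {}

    def enrol(self, user, features):
        self.templates[user] = features

    def verify(self, user, features):
        template = self.templates.get(user)
        if template is None:          # unknown users are always rejected
            return False
        return distance(template, features) <= self.threshold

auth = EEGAuthenticator(threshold=0.5)
auth.enrol("alice", [0.9, 0.3, 0.5])             # hypothetical enrolment features
print(auth.verify("alice", [0.85, 0.32, 0.48]))  # close match -> True
print(auth.verify("alice", [0.1, 0.9, 0.2]))     # far from template -> False
```

The appeal of the approach is exactly what the paragraph claims: unlike a password or a fingerprint, the "secret" is a live brain signal, which is hard to shoulder-surf or steal.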
Now, even though this new technology can change lives, every invention has its pros and cons. Yin and yang. Dos and don’ts. Balance and imbalance. We cannot turn a blind eye to the fact that this technology can be put to negative purposes rather than its optimum use. Warfare is certainly one such purpose. Spying and destruction are major drawbacks of this interface.
We have discussed the human and artificial intelligence interface, its types and uses, while acknowledging its dark side. But is this it? I have a notion that it is not. Technology grows and develops; it is a continuous process. This is especially true of artificial intelligence, which builds on itself and points researchers toward further prospects. We have seen many things that were fictitious in the last century made true. This, too, is possible, and would be a great advancement. I chose this topic because I am a firm believer that technology can make a difference. I look forward to the day when we will be a progressive nation with opportunities not only to discuss but to create and contribute to technological advancement.
About Writer: Nasir Iqbal is a student of Business and Operations Management at Forman Christian College University.