The Pursuit of Intelligent Machines: Part 5 – The Human-Machine Interface

Remember the Six Million Dollar Man? Colonel Steve Austin, an astronaut, crashes a test plane and is badly injured. Scientists and physicians reconstruct him using bionic parts. This TV series, which aired from 1974 to 1978, speaks of a future when humans and machines become one. The Six Million Dollar Man soon begat The Bionic Woman, ensuring that women as well as men could be half human, half machine. The premise of the Robocop movies is similar, marrying machine technology with a badly injured human to create a cyborg. Not to be outdone, Star Trek introduced an entire galactic cyborg species, The Borg, a culture that assimilated intelligent humanoid civilizations into a collective, implanting nano devices and body parts with specialized tool appendages and linking its members together through a hive mind.

Is this our 21st century future, a dehumanizing of our species as we integrate machine parts to replace our mortal ones? This posting is sub-titled The Human-Machine Interface. Technically, a human-machine interface is defined as the means by which a person interacts with a machine. For example, the mouse and keyboard on your computer are a human-machine interface, as are a video game joystick and the steering wheel of an automobile. That is not the kind of interface we are going to explore in this discussion.

On July 17, 2008, scientists convened for a conference in London, England, hosted by Dr. Nick Bostrom, Director of Oxford’s Future of Humanity Institute. Looking at the future of technology and our species, conference scientists discussed biotechnology, molecular nanotechnology, and artificial intelligence, and the potential use of these technologies to improve humanity physically, emotionally and intellectually. This combination of human and technology would lead to a new form of post-human life, according to Bostrom: “We will begin to….manage our own human biology….the changes will be faster and more profound than the very, very slow changes that would occur over tens of thousands of years as a result of natural selection and biological evolution.”

The futurist Dr. Ray Kurzweil believes that “this will happen faster than people realize.” He goes on to predict “the culmination of the merger of our biological thinking and existence with our technology, resulting in a world that is still human but that transcends our biological roots.” Forever an optimistic futurist, Kurzweil has been equally right and wrong in his predictions since he began his inventive pursuits in the latter part of the 20th century. But his ultimate prediction is based on some pretty hard science and technology that is rapidly evolving today.

Kurzweil talks about an approaching singularity, a point in time when we as humans will become interchangeable with the machine technologies we have created. Immortality will be the outcome, with our biological entities morphing into the technological world. He predicts we will have the capability to upload our minds to the Internet. We will be capable of doing this because the pace of electronic evolution is such that machine memory and processor speeds will dwarf the capacity of the human brain. When you consider the evolution of the transistor and microprocessors, Moore’s Law, the development of biological computing, and neurogrid computing, Kurzweil’s prediction of machine intelligence surpassing the human brain within the next 30 years seems quite reasonable. He prophesies, “we will increasingly become software entities.”
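The force behind these predictions is compound growth. As a back-of-the-envelope sketch (not a forecast), assuming transistor counts double every two years per Moore’s Law, with an illustrative starting figure of 2 billion transistors:

```python
# Back-of-the-envelope illustration of Moore's Law-style compound growth.
# The 2-year doubling period and the 2-billion-transistor starting point
# are assumptions for illustration only, not measured figures.

def transistors_after(years, start=2e9, doubling_period=2):
    """Projected transistor count after `years` of steady doubling."""
    return start * 2 ** (years / doubling_period)

for years in (10, 20, 30):
    print(f"After {years} years: {transistors_after(years):.1e} transistors")
```

Over 30 years that is fifteen doublings, a more than 30,000-fold increase, which is why exponential extrapolations so quickly outrun intuition.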

The Biotechnology Revolution

Biotechnology will be one of the driving forces behind this evolution to a human-machine merger and we will be exploring this subject in greater depth in future postings. But it is clear today that medical research is giving us a better understanding of the way DNA and RNA operate. We are starting to see gene therapy emerge as a way of treating intractable diseases. Just look at three very recent headlines to get a sense of the rapid progress being made by scientists.

Gene Therapy May Stall Inherited Emphysema

Gene Therapy and Stem Cells Save Limb

In New Way to Edit DNA, Hope for Treating Disease

We are entering a medical age in which we can target specific diseases with cures that manipulate the most basic constructs of life itself, our DNA and RNA. We are beginning to move out of the realm of hit-and-miss medicine; through gene manipulation and therapy we can begin to reverse the onset of disease and aging.

The Nanotechnology Revolution

Nanotechnology is another driving force behind the merging of humans and machines. Today one can subscribe to a journal called Nanomedicine, where you can read about research on using nanobots and nanoparticles for diagnostic and interventional medical procedures as well as for creating cures for diseases. Nanotechnology in medicine is being used to deal with appetite control, bone and blood replacement, cancer, diabetes and more. You can view a good list of current nanomedicine applications by clicking on the link provided.

We are today in hot pursuit of creating the first nanobots, or nano robots, that can navigate through our bloodstream like the miniaturized submarine of Fantastic Voyage, the 1966 film Isaac Asimov novelized. Proposed designs include:

  • respirocytes, a robotic red blood cell
  • chromallocytes, mobile cell-repair nanobots that can be inserted into the body on replacement therapy missions targeting specific tissues or organs
  • microbivores, nanobots designed to destroy pathogens in the blood

Kurzweil states, “Nanotechnology will not just be used to reprogram but to transcend biology and go beyond its limitations by merging with non-biological systems.”

The Artificial Intelligence Revolution

Artificial intelligence is gaining rapidly on human intelligence. As background, I encourage you to read my previous posting, When Will Computers Become More Human? Dr. Bostrom writes about the evolution of artificial superintelligence on his blog. As we develop computer technology that is smarter than we are and capable of enhancing its own intelligence, we enter the territory of science fiction films like The Matrix, which, of course, portrays the dark side of such a superintelligence.

There is a far more promising line of research that combines artificial intelligence and biology to let humans manipulate a cursor on a computer screen, or control the electronics and motors of an artificial limb, using thought alone. Today the military is spending significant sums on developing these types of human-machine interfaces.

Speculating further, one can envision humans interfacing with avatars through the Internet or its successors, experiencing virtual or real worlds where humans physically cannot go. Just such a scenario is portrayed in James Cameron’s new movie, Avatar, in which a human interfaces with an artificial body created to survive on a planet that is inhospitable to our species.

Superintelligence through artificial means may go an entirely different route if current research into Parkinson’s Disease is any indication. Today we are using nanotechnology to implant devices into the brain to treat not only Parkinson’s but also other neurological conditions such as epilepsy and depression. These implants stimulate neurons to supplant the brain’s own signals, or record actual brain signals and reroute them. Such technology has the potential to be used to enhance human intelligence by artificial means.

The Pursuit of Intelligent Machines – Part 1: Laying the Foundation for the 21st Century

The computing revolution described in previous blogs has as its natural outcome the potential to develop cyber intelligence that resides within a machine. Whether these machines are used to assemble material goods or are robots that roughly resemble human or other animal forms, it is clear that in the 21st century we are on a path towards this conclusion.

To understand the development of machine intelligence, let’s do a quick historical review. The roots of our industrial revolution began with the creation of machines that didn’t require human power to run. Other than human or animal muscle, running water and wind were the power sources for most machines prior to the 18th century. The harnessing of steam began the revolution that led to industrial manufacturing on a global scale in the 19th and 20th centuries.

What kind of machinery are we talking about? Mechanized processes powered by steam dominated the beginnings of what we call the Industrial Revolution. Steam engines originated in the last few years of the 17th century but came into common use by the middle of the 18th, largely in the mining industry. Constant innovation in steam technology led to significant power breakthroughs by the end of the 18th century. James Watt and Matthew Boulton perfected steam engines so that they could be used for general purposes, whether to pump water or drive the rotary machinery of a factory or a textile mill. By the early 19th century steam engines became mobile, leading to the building of steamboats and railway locomotives.

Two other developments paralleled the perfecting of steam power: the use of coal as a ready source of energy and the development of machine shops for the creation of machine tools. Machine tools have their origin in the world of watchmakers, the first assembly-line batch producers of the wheels, cogs and gears of the clockwork mechanisms we are familiar with to this day. The 19th century wars in Europe, nationalism, and the competitive pursuit of colonies played a considerable role in the perfecting of machine tools; the outcome was numerous inventions of devices for the creation of interchangeable parts. The revolution in transportation technology was a further driving factor. From 1860 to 1910, precision metal-working machine tools led to the standardization of mass production on a global scale.

By the mid-20th century a new element had been added to machinery and manufacturing: numerical control, the automation of machine tools through programmed instructions encoded on some form of storage medium. The first numerical control manufacturing systems used punch cards to deliver instructions to machines that cut shapes from metal. By creating a programmed series of instructions driven by the punch cards, the manufacturing equipment could precisely cut the metal into the appropriate shapes without human intervention. Punch cards were soon replaced by tape, in which precision instructions were represented by punched holes; the machine would read the tape and execute the program. It was only a matter of time before hand-punched numerical control tapes were replaced by computer controls and machine languages for creating precise manufacturing instructions.
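In modern terms, the tape described above is just an ordered list of encoded instructions, and the machine is a loop that reads and executes them one by one. A minimal sketch (the MOVE/CUT/STOP mnemonics and coordinates are invented for illustration, not a real NC or G-code dialect):

```python
# Minimal sketch of a numerical-control loop: a "tape" of encoded
# instructions drives a cutting head with no human intervention.
# The mnemonics (MOVE, CUT, STOP) are invented for illustration.

def run_tape(tape):
    """Execute each tape instruction in order; return the tool path."""
    position = (0, 0)
    path = [position]
    for instruction in tape:
        op, *args = instruction.split()
        if op in ("MOVE", "CUT"):   # CUT moves the head while cutting
            position = (int(args[0]), int(args[1]))
            path.append(position)
        elif op == "STOP":          # end of program
            break
    return path

tape = ["MOVE 0 5", "CUT 10 5", "CUT 10 0", "STOP"]
print(run_tape(tape))  # → [(0, 0), (0, 5), (10, 5), (10, 0)]
```

Swap the list of strings for holes punched in paper tape and the loop for electromechanical relays, and this is essentially the 1950s workflow; computer numerical control replaced the tape reader with software.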

Computer-aided design and computer numerical control have been with us ever since. These technologies are perfect for the mass production of similar products. This is what the 20th century was all about. But the 21st century is going down a new path where computer-aided design is focused on new consumption patterns leading to mass customization. We’ll talk about that in Part 2, our next blog.