Biomedicine Update – Healing the Heart

The expression “in a heartbeat” may not mean the same thing in the future if technology now under development has its way. Conventional thinking about replacing the heart with a device that works the way the heart does is being turned on its head by devices that don’t beat at all. These new artificial hearts are turbines that push blood through the body not in spurts but in a steady stream. Called continuous-flow devices, they have been under development for the last two decades.

The model for continuous flow is Archimedes’ screw, the 3rd-century B.C. device designed to lift water from the Nile to the fields that lined its banks. The screw is still in use in Egypt today, which is where Richard Wampler, a surgeon and engineer, first encountered it on a trip in 1976. A decade later he patented a device based on this ancient technology, but instead of moving water his was designed to move blood.

In use since the 3rd century B.C., Archimedes' screw is the technology behind new artificial hearts.
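The appeal of a rotary pump is its mechanical simplicity: output is governed almost entirely by rotation speed. Below is a minimal sketch of that relationship; the displaced volume per revolution is a made-up illustrative figure, not the specification of any actual device.

    # Minimal sketch: flow through a screw or rotary pump scales with speed.
    # The volume displaced per revolution is a hypothetical illustrative value.

    VOLUME_PER_REV_ML = 0.8  # assumed millilitres moved per revolution

    def flow_rate_l_per_min(rpm: float) -> float:
        """Approximate continuous flow in litres per minute at a given speed."""
        return rpm * VOLUME_PER_REV_ML / 1000.0

    for rpm in (6000, 8000, 10000):
        print(f"{rpm} rpm -> ~{flow_rate_l_per_min(rpm):.1f} L/min")

At these illustrative figures the output spans roughly 5 to 8 litres per minute, the neighbourhood of resting cardiac output, which is why a continuous-flow pump can be tuned with a single speed setting.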

Initially the device was used to assist failing and post-surgical left ventricles (the primary pumping chamber, which delivers blood to the body through the ascending and descending aorta). Before Wampler’s device, left ventricular assist devices, or LVADs, required external compressors and pulsed like the heart. Wampler’s device didn’t. Many surgeons and cardiologists were skeptical, concerned that blood passing through a turbine would be damaged. But the continuous-flow turbines did no damage to blood cells when implanted in test animals.

In November 2003 a commercial version of the pump, called the HeartMate II, was implanted in a young patient from Central America. He left the hospital and didn’t return for eight months. Before the surgery his heart had been weakened but was still functioning; by the time he came back for a follow-up examination it had all but stopped. He had no pulse and nothing one would recognize as a heartbeat. Yet he was up, walking around, and described himself as “feeling fine.”

The continuous-flow artificial heart consists of twin turbines designed to circulate blood throughout the body. Unlike prior artificial hearts, it doesn't try to emulate the natural heart. It doesn't beat. It doesn't create a pulse in its recipients. Source: Popular Science

Since 2008 the HeartMate II has been implanted in 11,000 people worldwide. This device, an enhanced LVAD, has had a remarkable side effect in many patients: it reverses heart failure, confounding cardiologists who had assumed that any damage to the heart was permanent.

According to the developers of this technology and the medical teams implanting it, the step from LVAD to a full continuous-flow artificial heart in humans is less than a decade away. These new devices are currently undergoing animal trials with encouraging results. It is only a matter of time, not in a heartbeat but certainly within a decade, before continuous-flow artificial hearts become commercially available for humans.

Biomedicine – Part 8: Robotic Exoskeletons

Animals come in many shapes and forms. Insects and other arthropods share a common physical attribute: they wear their skeletons on the outside, a hallmark of many invertebrates. Humans and other mammals, along with birds, reptiles, amphibians and fish, the vertebrates, have an internal skeleton, or endoskeleton. In this blog we explore the fusing of internal-skeleton biology with robotic outer skeletons, or exoskeletons, to restore mobility and strength to humans suffering from catastrophic injuries and diseases that impair movement.

The Necessity and History of Human Exoskeleton Technology

Our earliest forays into exoskeleton technology had nothing to do with restoring movement to a human suffering from an injury or disease. We began with the goal of protecting warriors in battle, since we so often found ourselves in conflict with our neighbours. An escalating arms race included not just weapons but also defences. We invented shields, breastplates, chain mail, and full body armor to counter increasingly sophisticated weapons.

Medieval warriors wore an armor exoskeleton to protect them from a variety of weapons.

The medieval iron-plated suit, seen above, protected warriors but also encumbered them. Suited and on horseback, a warrior was mobile and capable of using lance, mace and broadsword to attack enemies. Out of the saddle, a warrior became almost helpless: walking was a chore, let alone fighting.

We have yet to end warfare, and our fascination with finding new ways to armor soldiers continues. Today’s army wears synthetic fibers such as Kevlar: stronger than steel, lighter, and malleable enough to conform to the body. But soldiers encased in these materials suffer heat stress when the armor is worn for long periods, because the materials do not breathe. To counter the heat buildup of modern body armor, researchers are experimenting with personal cooling systems based on phase change materials, or PCMs, which act as heat sinks to cool the wearer.
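The arithmetic behind PCM cooling is straightforward: a material absorbs a large, fixed amount of heat as it melts. The sketch below estimates how much PCM an hour of cooling would require, using a typical latent heat for paraffin-type materials and an assumed heat output for a soldier under load.

    # Rough sizing sketch for a phase change material (PCM) cooling vest.
    # The heat output and duration are assumed illustrative figures.

    BODY_HEAT_W = 300          # assumed metabolic heat output under exertion
    DURATION_H = 1.0           # hours of cooling the PCM should cover
    LATENT_HEAT_KJ_KG = 200    # typical latent heat of fusion for paraffin PCMs

    heat_kj = BODY_HEAT_W * DURATION_H * 3600 / 1000   # total heat to absorb
    pcm_mass_kg = heat_kj / LATENT_HEAT_KJ_KG
    print(f"Heat to absorb: {heat_kj:.0f} kJ -> ~{pcm_mass_kg:.1f} kg of PCM")

At these assumed numbers roughly five kilograms of PCM buys an hour of relief, which suggests why such systems target bursts of exertion rather than all-day cooling.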

A more recent line of research studies how wearable robots can enhance the strength and capability of soldiers. Scientists studying arthropod exoskeletons have long been fascinated by the strength these animals exhibit; an ant, for example, can carry objects 10 to 50 times its own weight. Could a soldier outfitted with an exoskeleton do the same?

Exoskeletons and Military Research

Hardiman, designed by General Electric in 1966, was a balky first attempt at a powered exoskeleton capable of giving its human operator superhuman strength. An operator wearing Hardiman (seen below) could lift weights of up to 680 kg (1,500 lbs).

Hardiman weighed 680 kg and allowed its human operator to lift objects equivalent to its own weight.

But Hardiman suffered from numerous technical problems. It was heavy, its hydraulic and electromechanical systems frequently failed, and it relied on heavy, inefficient batteries.

Later, Project PITMAN, out of Los Alamos National Laboratory, began long-term development of an exoskeleton concept suitable for military field deployment. What the army sought was a powered, armored external apparatus easily worn by a soldier in the field. The suit would be invulnerable to gunfire, and its wearer would be able to lift heavy equipment and weapons with ease and carry wounded soldiers from the field of battle. The suit would also give its operator the ability to jump easily and move faster than a normal human pace.

Research in the last twenty years has continued to focus on prototype designs for wearable exoskeletons. One company, Sarcos, recently acquired by Raytheon, has made considerable progress with its XOS robotic suits. The current model, XOS 2, gives its wearer the strength and endurance of two or three soldiers. Consisting of sensors, actuators and controllers, and powered by high-pressure hydraulics, XOS 2 is operator friendly. A wearer can step into it and immediately lift 90 kg (200 lbs) repeatedly or run at 16 kilometers per hour (10 mph) without tiring, while still climbing ramps and stairs with agility, kicking a soccer ball, handling and passing a basketball, or punching a boxing bag repeatedly without fatigue. The wearer can step out of an XOS 2 at any time while it waits or operates autonomously on an assigned task.

XOS 2 is a robot exoskeleton that enhances the ability and strength of its wearer. Source: Raytheon Company

Lockheed Martin is developing HULC™, the Human Universal Load Carrier, a lower-torso exoskeleton designed to let soldiers in the field carry loads of up to 90 kg (200 lbs) for extended periods across all types of terrain without fatigue. HULC uses a titanium exoskeleton and a computer controller with hydraulics and servomotors, and lets the field operator snap in modular components such as a lifting frame for hands-free assistance. The wearer can travel at speeds up to 16 kilometers per hour (10 mph), and the suit is easy to put on and take off in battlefield conditions. Currently its lithium-ion battery gives it limited field range; the company is working with Protonex to develop a fuel-cell power source that would provide 72 hours of continuous power.
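The jump from hours of battery life to 72 hours of continuous power is easiest to appreciate as a back-of-the-envelope energy budget. The sketch below assumes an average power draw, since the actual HULC figures aren’t given here.

    # Rough energy budget for a 72-hour exoskeleton power source.
    # The 250 W average draw is an assumed figure, not a HULC specification.

    AVG_POWER_W = 250      # assumed average power draw of the exoskeleton
    MISSION_HOURS = 72     # the continuous-operation target cited above

    energy_wh = AVG_POWER_W * MISSION_HOURS
    print(f"Energy required: {energy_wh} Wh ({energy_wh / 1000:.0f} kWh)")

    # Compare against a typical lithium-ion energy density (~150 Wh/kg).
    LI_ION_WH_PER_KG = 150
    print(f"Equivalent Li-ion mass: ~{energy_wh / LI_ION_WH_PER_KG:.0f} kg")

At these assumed figures the battery alone would outweigh the soldier, which is why a fuel cell, refuelled rather than recharged, is the more plausible route to multi-day endurance.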

The Biomedical Payoff

Military research in the United States has always led to civilian applications, and it is no different with exoskeleton technology. Immediate benefits include enabling people with spinal injuries or neuromuscular diseases to leave their wheelchairs, stand, and walk without help.

Ekso Bionics, a company located in California, is pursuing that goal. Its product, Ekso, offers wheelchair users an escape through its lower-torso exoskeleton. Ekso’s electric motors, coordinated by intelligent networked software, give its operator a natural walking gait; an Ekso wearer can sit, stand and walk. At $100,000 the device represents an expensive solution, but the company expects the price to fall by half in the near future while it adds many more functions and features. The goal for Ekso is a wearable device that the operator puts on in the morning and uses all day to function normally: going up and down stairs at home, driving the car to work, or taking in a baseball game or a movie at night.

Ekso uses 4 electric motors, an onboard computer and 15 sensors to drive it. Source: Ekso Bionics

Other companies are pursuing a similar goal. One of them is Cyberdyne, headquartered in Tsukuba, Japan, and the creator of HAL 5, the Hybrid Assistive Limb. HAL 5 is a wearable, battery-operated exoskeleton usable for over 2 hours between recharges. Unlike Ekso, it provides robotic assistance for both the upper and lower body. HAL 5 takes its cues from nerve signals transmitted from the brain: the wearer controls it by thinking about an action. Sensors attached to the skin pick up the nerve impulses and translate them into movement commands such as standing and walking. The robot’s voluntary control system combines with its autonomous control system to create a full range of movement, powering both upper and lower limbs.
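A heavily simplified sketch of that control idea follows: a bioelectric signal measured at the skin is rectified and smoothed, and a movement command fires when the level crosses a threshold. The signal, threshold and command name are all invented for illustration; Cyberdyne’s actual control system is far more sophisticated.

    # Minimal sketch of threshold-based bioelectric control, loosely in the
    # spirit of HAL 5's skin-surface sensing. All values are illustrative.
    import math

    def smoothed_envelope(samples, alpha=0.1):
        """Rectify the raw signal and low-pass it with an exponential average."""
        level, envelope = 0.0, []
        for s in samples:
            level = (1 - alpha) * level + alpha * abs(s)
            envelope.append(level)
        return envelope

    THRESHOLD = 0.5  # assumed activation level for issuing a command

    # Fake signal: quiet muscle, then a burst of activity (intent to move).
    signal = [0.05 * math.sin(i) for i in range(50)] + [math.sin(i) for i in range(50)]

    for i, level in enumerate(smoothed_envelope(signal)):
        if level > THRESHOLD:
            print(f"sample {i}: level {level:.2f} -> command: ASSIST_STAND")
            break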

HAL 5 gives us some insight into exoskeleton design in the near future. Neuroprosthetics that capture electrical signals from the brain will make seamless integration between exoskeletons and their wearers possible. As we improve the human-robot interface we will largely eliminate motor disabilities.

Biomedicine – Part 8: Robots to the Rescue – Robots that work on the inside

In our last blog we introduced HeartLander, a device that, when inserted into the chest cavity, can deliver medication and ablation therapy and assist in lead placement for pacing the heart muscle. HeartLander’s developers hope to shrink it from its current 8.5 millimetres to 3 millimetres. Devices of this type represent the start of a new class of robotic devices built to operate autonomously within the human body. We’ll look at where the technology stands today and what we can expect in the near future.

In the world of internal medical robots, HeartLander is a giant. Researchers have something far smaller in mind when they conjure up micro-robot designs, working at the scale of individual cells. We have broached this subject before in a blog and stated at the time of writing that there were no existing biomedical nanobots. That remains true, but researchers at ETH Zürich have been building what they call micro-robots that are bacterial-scaled, measuring between 5 and 15 micrometres in length. Called Artificial Bacterial Flagella, or ABFs for short, these devices swim like bacteria with corkscrew tails.

Researchers at ETH Zürich are building micro-robots as small as bacteria, observable only under a microscope. Source: Institute of Robotics and Intelligent Systems/ETH Zürich

Made by depositing vaporized indium, gallium, arsenic and chromium onto a substrate in layers a few atoms thick, the ABFs are patterned using lithography and etching. When sliced thin they naturally curl into the ribbon corkscrew you see in the picture above. The ABF head, seen on the right, contains a tri-layer film of chromium, nickel and gold; nickel’s magnetic properties make it possible to use an external rotating magnetic field to provide locomotion. The ABF uses helical propulsion (the same method of locomotion used by many bacteria) to swim through liquid at speeds of up to 20 micrometres per second at present, with plans to increase that to 100 micrometres per second. Compare that to E. coli, which swims at about 30 micrometres per second.
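For an idealized corkscrew swimmer there is a simple kinematic ceiling: turning in sync with the rotating field, it can advance at most one pitch length per revolution. The sketch below uses assumed pitch and slip values to put the quoted speeds in context.

    # Kinematic sketch of helical propulsion: an ideal corkscrew advances at
    # most one pitch length per field revolution. Pitch and slip are assumed.

    PITCH_UM = 5.0   # assumed helix pitch in micrometres
    SLIP = 0.7       # assumed fraction of the ideal advance lost to drag

    def swim_speed_um_per_s(field_hz: float) -> float:
        """Forward speed in micrometres per second at a field frequency."""
        return PITCH_UM * field_hz * (1 - SLIP)

    for hz in (5, 10, 20):
        print(f"{hz} Hz field -> ~{swim_speed_um_per_s(hz):.1f} um/s")

At a 5-micrometre pitch and 70 percent slip, a field spinning at about 13 Hz yields the 20 micrometres per second reported above.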

To give ABFs autonomous power, researchers are looking at thin-film rechargeable batteries; currently these batteries can be manufactured and shaped at thicknesses of less than 50 nanometers. Chemical fuel sources may prove even more promising: an ABF running on energy harvested from the bloodstream, scavenging glucose and oxygen for motive and computing power, could operate indefinitely within the body. Some researchers are looking at using bacteria themselves as ABFs, modifying a living cell by attaching nanoparticles to it so that it can be mobilized for biomedical purposes.

What can ABFs do that current biomedical technology cannot?

  • Targeted delivery of drugs to a specific area of the body, reducing the risk of side effects to the rest of the body. This kind of therapy could even be sub-cellular, delivering medicine that alters genes within a chromosome.
  • Placement of radioactive seeds near tumor cells. Called brachytherapy, this targeted radiation therapy delivers a killing dose only to selected cells, leaving healthy cells alone.
  • Delivering heat therapy called thermoablation to selected cells to destroy them without damaging healthy surrounding tissue. The ABF’s magnetic properties would prove useful for this type of therapy.
  • Implanting of stem cells using the ABF as the carrier. Stem cells could then be delivered to an area of the body to regenerate hearing, sight, organs and bones.
  • Collecting tissue samples for biopsy. The ABF would excise a small sample and, once excreted or removed from the body, allow for quick on-the-spot analysis to determine whether cancer is present.
  • ABFs could be used as scaffolding material, similar to the way stents are used today, but on a scale that makes them cellular building blocks for regrowing blood vessels, nerves and organs. As small as today’s stents are, an ABF could act as a stent in the smallest of blood vessels, or a group of them could combine to form a larger scaffold.
  • ABFs could be used to block or occlude blood vessels feeding a tumor causing the tumor to starve and die.
  • ABFs with specialized implanted electrodes could be used to restore severed nerves in a spinal column or provide neural connections to reverse brain damage.
  • Remote sensing represents one of the most promising uses of ABFs. An ABF could be packed with instrumentation to measure blood-oxygen levels, arterial flow, blood pressure and other vital signs, instantly transmitting the results to external monitors (see the sketch after this list).
  • ABFs introduced into a patient suffering from multiple injuries could hover at sites where organ damage or internal bleeding has been identified and constantly provide status updates to the physicians dealing with the emergency.
  • Fetal surgery represents a promising field for the use of ABFs, providing doctors with the means to correct congenital heart defects in utero, perform ablation therapy to deal with congenital malformations, clear obstructions in blood vessels and the urinary tract, or replace needles in collecting samples for amniocentesis and other fetal tests.
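As a thought experiment, the remote-sensing idea above might look like the following on the receiving side; the packet fields, units and alert threshold are entirely hypothetical.

    # Hypothetical telemetry from an in-body micro-robot. Field names, units
    # and the alert threshold are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class AbfReading:
        site: str            # where the device is stationed
        spo2_percent: float  # local blood-oxygen saturation
        flow_ml_min: float   # local arterial flow

    def triage(reading: AbfReading) -> str:
        """Flag a reading for attention if oxygenation drops too low."""
        return "ALERT" if reading.spo2_percent < 90.0 else "ok"

    readings = [
        AbfReading("hepatic artery", 96.5, 420.0),
        AbfReading("splenic laceration site", 87.2, 310.0),
    ]
    for r in readings:
        print(f"{r.site}: SpO2 {r.spo2_percent}%, "
              f"flow {r.flow_ml_min} mL/min -> {triage(r)}")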

Currently ABFs are confined to the research laboratory. But soon we may see these microscopic devices applied to all kinds of medical procedures, reducing the need for surgery, minimizing the side effects of cancer chemotherapy, and giving biomedical professionals a new set of tools for tackling conditions for which current medical practice has few solutions.

For those developing this technology it’s a question of tinkering with the power source to come up with the best way to deploy micro-robots.

Biomedicine – Part 4: The Evolution of the Human-Machine Duopoly

Biomechanics is the biological equivalent of a human-machine duopoly. A duopoly is normally a marketing term describing a market with only two sellers, so I am taking poetic license in using it here to describe what is both a convergence of humanity and machines and a symbiosis.

In the world of biology, the human-machine duopoly can be equated with biomechanics and bioelectronics. When you think biomechanics, think prosthetic devices. When you think bioelectronics, think imprintable circuits that can be applied to or injected into the body to monitor or help maintain health. The two can be combined to make smart prosthetic devices. It will become harder and harder to separate the biological from the electromechanical.

Prosthetics

In times of war it seems we make the greatest advances in creating replacement parts for lost human limbs. We have been doing this as far back as 484 BCE: Herodotus, in “The Histories,” tells of a Persian soldier who replaced a severed foot with one made of wood. Archeological finds have turned up limbs made of copper, later replaced in the Renaissance by iron prostheses. In the 16th century Ambroise Pare, a French surgeon, invented both upper- and lower-body prosthetics; his arm prosthetic featured a hand operated by springs, while the leg included an articulated knee joint.

Ambroise Pare, in the 16th century, developed some of the earliest known biomechanical prostheses.

Pare is known for a number of other battlefield innovations including ligatures, field bandages, and surgical instruments that were the precursors to those used today in operating rooms.

The American Civil War led to surgical advances and experimentation with prostheses. Chloroform anesthesia made it possible for surgeons to amputate and prepare leg and arm stumps to better fit prosthetic replacements. The first documented amputee of that war, James Edward Hanger, was so dissatisfied with the prosthetic wooden leg he received after losing his own to a Union cannonball that he designed and built a new one from barrel staves, rubber and wood, with hinges at the knee and foot. Soon after, Hanger was awarded a patent for his artificial limb.

Hanger Prosthetics owes its origins to the first documented amputee of the American Civil War.

Today Hanger Prosthetics and Orthotics is a major manufacturer of artificial replacement parts for humans and even dolphins.

In World War 1 prostheses didn’t just replace lost limbs: soldiers who suffered facial disfigurement received lightweight metal and rubber masks, often painted while being worn so the colour matched the skin. War veterans could buy prosthetic limbs advertised in local newspapers. Prostheses made from lightweight metals such as aluminum replaced wood, iron and steel. World War 2 saw the introduction of plastics and glass fibres, and post-war research led to mechanical arms with hooked ends that could be opened or closed using shoulder muscles. The Vietnam War inspired further advances, introducing electronic controls to replacement limbs.

Our ingenuity in replacing damaged body parts goes well beyond appendages and face masks. Cosmetic prosthetics have included dentures to replace decayed teeth and artificial eyes. Unlike dentures, which served adequately to replace teeth, artificial eyes could not replace eyesight, at least not until recently.

Bionics: More than Just Artificial Arms and Legs

In 1974 American television was introduced to Steve Austin, The Six Million Dollar Man, a test pilot and astronaut who after an accident was transformed into a fusion of human and machine. Austin may have been a fictional creation, but the science of bionics, the creation of artificial systems to replace natural ones, is very real. Our sensory organs, the ear in particular, have been the subject of all kinds of bioengineering inventiveness.

Today, of all our senses, hearing is where we have made the greatest advances in artificial assistance. From Alexander Graham Bell’s earliest forays to the era of the transistor and the computer chip, which made wearable hearing aids possible, the technology largely focused on amplifying sound. Hearing aids have grown progressively smaller and more capable at handling sound quality and background-noise cancellation. But essentially none of these devices can help humans who are totally deaf.

The advent of the cochlear implant, a prosthetic device designed to compensate for sensorineural hearing loss, represents a major advance in prosthetic technology. The first attempts at implantable hearing devices date back to 1961, and in 1984 commercial versions of cochlear implants became available for surgical implantation.

Cochlear Implants marry technology directly to sensory nerves in the inner ear. Source: The Mayo Clinic

The device connects directly to the auditory nerve, bypassing the damaged structures of the inner ear. The implanted portion, placed subcutaneously, contains a receiver, a magnet, wires and an electrode array. A digital signal processor with cable and microphone is worn externally in the area of the ear. The external microphone picks up sound, which the digital signal processor converts to an electrical signal; that signal is transmitted to the subcutaneous receiver, which passes it to the auditory nerve fibres, which in turn carry it to the auditory centres of the brain. Advances in digital signal processing will further improve cochlear implants, leading to better speech processing and a greater range of discrimination across sound frequencies.
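At the heart of that digital signal processor is, in essence, a bank of band-pass filters whose output energies set the stimulation levels of the electrodes. Here is a minimal sketch of the idea; the band edges and four-electrode layout are simplifications, not the parameters of any real implant.

    # Minimal sketch of cochlear-implant-style processing: split sound into
    # frequency bands and map each band's energy to an electrode level.
    # Band edges and electrode count are illustrative, not device parameters.
    import numpy as np

    FS = 16000                                                     # sample rate, Hz
    BANDS = [(200, 600), (600, 1500), (1500, 3500), (3500, 7000)]  # one per electrode

    def electrode_levels(audio):
        spectrum = np.abs(np.fft.rfft(audio))
        freqs = np.fft.rfftfreq(len(audio), d=1 / FS)
        levels = []
        for lo, hi in BANDS:
            levels.append(float(spectrum[(freqs >= lo) & (freqs < hi)].sum()))
        return levels

    # A 1 kHz test tone should light up the second band/electrode.
    t = np.arange(FS // 10) / FS
    tone = np.sin(2 * np.pi * 1000 * t)
    print([round(v, 1) for v in electrode_levels(tone)])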

Touch Bionics is a company developing fully articulated myoelectric prosthetic devices. “Fully articulated myoelectric” means the prosthetic is active, capable of a range of function that approximates a natural limb or appendage, including very fine dexterity. The company has developed the i-limb™ ultra with biosim control software. With this prosthetic hand the wearer can type, pinch, grip objects from marbles to eggs, apply variable pressure, shake hands, and tie shoelaces. The hand works by picking up signals from electrodes placed on the surface of the residual arm and translating them into movements of the prosthetic. Software lets the wearer further customize the range of activities, monitor myoelectric impulses, and train with the device.
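The “variable pressure” feature hints at proportional control: rather than acting as a simple open/closed switch, the measured muscle activity scales the grip force. A toy sketch of that mapping follows, with all gains and limits invented for illustration.

    # Toy sketch of proportional myoelectric grip control: stronger measured
    # muscle activity yields a firmer grip. All numbers are invented.

    MAX_GRIP_N = 60.0    # assumed maximum grip force in newtons
    NOISE_FLOOR = 0.05   # activity below this is treated as rest

    def grip_force(emg_level: float) -> float:
        """Map a normalized muscle-activity level (0..1) to a grip force."""
        if emg_level < NOISE_FLOOR:
            return 0.0
        return min(emg_level, 1.0) * MAX_GRIP_N

    for level, task in [(0.02, "rest"), (0.15, "hold an egg"), (0.9, "firm handshake")]:
        print(f"{task}: activity {level:.2f} -> {grip_force(level):.1f} N")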

Retooling the Human Body: Imprinting, Biosensors, Bioelectronics and Implants

We started pacing the heart with implantable devices in 1958. With the shrinking of transistors, the development of integrated circuits, and now micro-circuitry approaching the atomic scale, we are marrying electronics and biology to work together.

Imprintables

Researchers at the University of Illinois have developed electronic devices that imprint on the skin like temporary tattoos. These sensory devices work unobtrusively, recording medical vital signs. Each device is protected by a rubbery silicone film and stretches with the skin. The electronics include light-emitting diodes, solar cells for autonomous power, transistors, and wireless connectivity to uplink data to a monitoring computer. A sensor can be programmed to pick up heartbeats, muscle activity or brainwaves. It can also be used as a computer controller, making it useful for studying body movement; some have even suggested its use as a game controller.
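On the monitoring side, turning such a sensor’s raw stream into a vital sign is a classic peak-counting problem. The sketch below detects beats in a synthetic pulse signal; the sample rate and threshold are illustrative, not properties of the Illinois device.

    # Sketch: estimate heart rate from a sensor stream by counting peaks.
    # The synthetic waveform, sample rate and threshold are illustrative.
    import math

    FS = 50          # assumed samples per second from the skin sensor
    THRESHOLD = 0.5

    # Synthetic 10-second pulse signal at 1.2 beats per second (72 bpm).
    signal = [math.sin(2 * math.pi * 1.2 * i / FS) for i in range(FS * 10)]

    beats = sum(
        1
        for i in range(1, len(signal) - 1)
        if signal[i] > THRESHOLD
        and signal[i] >= signal[i - 1]
        and signal[i] > signal[i + 1]
    )
    print(f"Estimated heart rate: {beats * 6} bpm")  # beats in 10 s -> per minute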

Implantables

Implanting a skin sensor is one thing. What about implanting a device that combines biological and artificial materials to become part of a person’s body? These 21st-century inventions will include devices like heart valves that grow with the child, artificial ears, and artificial eyes. We’ll talk more about implantables when we explore nanotechnology in biomedicine in a future blog.

Biosensors, Bioelectronics

Biosensors are devices that incorporate biological or biologically mimicking materials such as engineered proteins or synthetic receptors. This is technology at a molecular scale. Such devices integrate microsystems technology and draw on a variety of energy sources for power. For example, a biosensor with bioelectronics could use the chemistry of the bloodstream, or its fluid dynamics, to steer and manoeuvre through the body; or it could carry a biological fuel cell and navigate autonomously. Biosensors are designed to broadcast to a receiver and can be used to monitor all kinds of vital body activity.

Currently biosensors are being used to study DNA and proteins in cells. Future applications of this technology include better methods for measuring biological systems. Here are near-future applications for the convergence of humans and machines at the micro level:

  • Artificial pancreas to regenerate insulin production in the body and end diabetes
  • Neural implants to correct loss of vision and hearing
  • Medical drug and sera delivery systems for disease prevention or for curing an existing condition
  • Repair of spinal cord or brain injuries through neuron repair or implantable nerve chips
  • A lab-on-a-chip for continuous monitoring of vital functions

The National Institute of Standards and Technology (NIST) roadmap for biosensor and bioelectronic development sees much of this happening before 2030.

Biomedicine – Part 3a: A Postscript on Medical Simulation and Virtual Reality

In Part 3 we neglected to discuss the use of virtual reality for assessment and rehabilitation related to physical injuries, brain trauma and neuropsychological problems. Computer simulation and video gaming are the new tools in this field. The following describes the use of these technologies here at the beginning of the second decade of the 21st century.

Assessment of the Physical Using Virtual Reality

What is assessment using virtual reality? It is the modelling of human movement in athletic and work environments to find ways of mitigating the risk of both short- and long-term injury to the body. In the image below, OpenSim software models human movement to help us understand the interaction of our muscles and skeletal structure, improve diagnosis of injuries, and provide early intervention before a potential health problem develops.

Stanford University scientists have created OpenSim, a tool for modeling and simulating movement.

OpenSim was first made available, in the summer of 2007, as an open-source platform for building, exchanging and analyzing musculoskeletal models and movement simulations. Today it is widely used throughout the world by biomechanical engineers, computer scientists, robotics designers, neuroscientists, medical professionals, physical therapists and sports physiologists. Fields of rehabilitation research include stroke, spinal cord injury, cerebral palsy, prosthetics, orthotics, osteoarthritis and robotics-assisted therapy.

For children with cerebral palsy who suffer from crouch gait, OpenSim allows doctors to assess whether a child is a suitable candidate for hamstring-lengthening surgery to correct a symptom that can eventually lead to permanent dependence on a wheelchair.
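Assessments like these rest on classical rigid-body dynamics. As a flavour of the computation involved, here is a toy inverse-dynamics sketch for a single joint, estimating the torque a muscle group must supply to hold a limb segment at a given angle. The segment mass and length are made-up values, and this is not OpenSim’s API; real musculoskeletal models track dozens of coupled segments and muscles.

    # Toy inverse-dynamics sketch for one joint: the torque needed to hold a
    # limb segment static against gravity. Mass and length are illustrative.
    import math

    G = 9.81            # gravitational acceleration, m/s^2
    SEGMENT_MASS = 3.5  # assumed mass of the limb segment, kg
    SEGMENT_LEN = 0.4   # assumed segment length, m (centre of mass at midpoint)

    def holding_torque(angle_deg: float) -> float:
        """Joint torque (N*m) to hold the segment at angle_deg from vertical."""
        return SEGMENT_MASS * G * (SEGMENT_LEN / 2) * math.sin(math.radians(angle_deg))

    for angle in (0, 45, 90):
        print(f"{angle} deg from vertical -> {holding_torque(angle):.1f} N*m")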

Using Virtual Reality to Treat Brain Trauma and Neuropsychological Problems

Virtual environment immersion is one of a number of treatment methods medical professionals are deploying for conditions like Post-Traumatic Stress Disorder (PTSD). The United States military has had to deal with PTSD, concussion and exposure to neurotoxins in its recent wars in Iraq and Afghanistan. In the past, tests for detecting brain trauma usually involved a physical examination or written responses to a questionnaire. But researchers in this field have recognized that computer-based cognitive testing lets caregivers identify the nuances of different types of trauma and devise specific therapies to help patients.

Virtual Reality Immersion Technology

For veterans, the video games provide multi-sensory environments similar to their wartime engagements. They patrol on foot or in vehicles and are exposed to local populations, some hostile, to detonations from improvised explosive devices, to shelling, bombing and grenades, and to virtual gunfire. The games track neuropsychological measurements including eye movement, psychological and physiological changes, and systemic functions such as cardiovascular and brain-wave activity. For PTSD, virtual reality provides exposure therapy that outperforms traditional exposure therapies in dealing with PTSD symptoms.

Virtual reality therapy places a patient in immersive environments where phobias can be experienced. Patients become desensitized through repeated exposure in a safe, controlled atmosphere where psychologists can measure physiological responses accurately and treat any associated panic and anxiety. Virtual reality is being used to treat fear of flying and of driving, agoraphobia, social phobias and panic disorders, and the trauma associated with accidents and PTSD.

The Center for BrainHealth at the University of Texas treats patients with Asperger Syndrome and autism using virtual reality. The virtual world contains normal, everyday experiences and settings: patients practice social skills as they interact in virtual restaurants, stores, workplaces and homes. Practicing in these settings diminishes anxiety and fear and fosters greater independence. The technology lets the physician increase the complexity of the experiences to simulate real encounters, and records physiological responses to track progress. Over time this rewires pathways within patients’ brains, allowing them to lead more normal lives.

What is the Next Step in the Evolution of These Technologies?

As effective as simulations and video games can be, the development of holographic imaging will take us into a far more immersive environment, one in which even those we currently cannot restore to fully active, normal life will be able to experience virtual space as if it were real. The question for all of us is: will we be able to tell the difference? In future blogs we will return to this technology’s use in biomedicine as we explore its evolution through mid-century and beyond.

Biomedicine – Part 1: The Promise of Medical Technology in the 21st Century

Humanity is closer today to immortality than it has ever been. We have surpassed Darwinian survival of the fittest to reach a new stage in evolution, creating humans reshaped by advances in biology combined with technology. In the 21st century one of our challenges will be: do we really want to go there? Is immortality what we seek? What are the consequences of intervening in natural processes, of manipulating the human genome, of ending aging?

In unlocking the mysteries of the human genome we are learning to master the very essence of what makes us human: the code that orders and sequences our anatomy and operates our physiology, that explains why some of us are more susceptible to particular diseases than others, and that tells us who is likely to get cancer, develop early-onset Alzheimer’s disease, or pass a congenital illness to descendants.

DNA is the programming language of life. It has existed on Earth for almost 4 billion years. In the last decade of the 20th century and the first of the 21st we mapped our genome in its entirety. We are, in effect, in a position to replicate the genome and even make improved copies of ourselves. The science we have practiced in agriculture we are starting to apply to ourselves.

Today genetic sequencing is the fastest-growing biomedical industry on the planet. Biology has become biotechnology, with ever more powerful computers central to building new proteins, medicines and organisms. We are writing genetic code just as if it were a programming language: instead of writing software, we are writing life. We have synthesized viruses and bacteria. When will we be capable of “creating” multicellular life? We are not far away. Those who profess religious faith would say we are playing at being God.
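The programming-language analogy is surprisingly literal: DNA’s four-letter alphabet is read in three-letter “words,” called codons, each specifying an amino acid. The snippet below translates a short, made-up DNA fragment using a few entries from the standard genetic code.

    # DNA as code: translate a short, made-up DNA fragment into amino acids.
    # The codon table below is a small excerpt of the standard genetic code.

    CODON_TABLE = {
        "ATG": "Met",  # methionine, the usual "start" signal
        "GCT": "Ala", "GGT": "Gly", "TGG": "Trp",
        "TAA": "STOP",
    }

    def translate(dna):
        """Read the sequence three letters at a time until a stop codon."""
        protein = []
        for i in range(0, len(dna) - 2, 3):
            amino = CODON_TABLE.get(dna[i:i + 3], "?")
            if amino == "STOP":
                break
            protein.append(amino)
        return protein

    print(translate("ATGGCTGGTTGGTAA"))  # ['Met', 'Ala', 'Gly', 'Trp']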

The truth is we still do not know how life got started on Earth. We don’t know how nature made us, but we have discovered the toolkit nature uses. We have also looked into ourselves and made other discoveries. We are not a single entity: we are a combination of some 10 trillion human cells playing host to 100 trillion bacteria and viruses that help us thrive. In knowing our genome we have the means to engineer the 10 trillion cells, but we can also alter our bacterial passengers to improve our health and prolong life.

In 1900 we were humanity 1.0, and our understanding of our own biology remained in its infancy. Insulin for treating diabetes was discovered only in 1921; the first natural antibiotic, penicillin, was identified in 1928. These two medical discoveries mark the development of humanity 1.1, a species interdependent with the medical support system that evolved over the 20th century.

In our mastery of the DNA toolkit what will humanity 2.0 be in 2050, and what will 3.0 be in 2100?

Here are some of the issues and technology trends that will drive biomedical innovation in the 21st century:

  1. The evolution of computational biology and the coding of organisms. Today biologists are expressing biology using mathematics and computer programs. Like their physicist brethren, they are using mathematical concepts to describe biology at its most basic constructive levels: cellular, molecular and genetic. What will be the outcome of this pursuit?
  2. The use of digital imaging, virtual imaging and simulation to understand human physiology better. Creating simulations of human movement and virtual human models on computer systems will lead to better treatment of athletes, dancers, and others in physically demanding occupations.
  3. Biomedical engineering of organs, skin, muscle, blood vessels, blood and bone using stem cell technologies and other bio-construction materials. This will include advances leading to the ability of the body to regenerate lost limbs.
  4. The development of human-machine interfaces leading to the creation of artificial organs and smart prostheses fully integrated to work with the natural body seamlessly. We will witness everything from implantable heart valves in infants that grow with the child, to artificial ears, eyes and limbs as the 21st century unfolds.
  5. Synthesizing of pharmaceuticals to develop one-to-one disease management. This will be particularly effective in dealing with diseases like cancer where drugs will be closely matched to the biochemistry of specific tumors.
  6. Advances in nanotechnology leading to implantable life monitors and directed treatment at disease sites within the body. Nanotechnology will make it possible to do both diagnosis and repair on an entirely new scale. With nanobots it will be possible to do on site repair of internal injuries, to deliver personalized medication to a cancerous tumor site, to remove arterial plaque, and provide mapping and imaging of internal systems at a level of detail previously unavailable.
  7. Life mapping using predictive technologies.  Today we test newborns to discover whether they carry inherited diseases. As we progress through the 21st century genomic profiling will give us the means to intervene with newborns providing  prescriptive remedies to potential futures, enhancing the human experience from in utero to end of life.
  8. Computer-aided surgery using remotely controlled robotic devices. Today we have developed less invasive procedures to replace heart valves and implant devices in the body. Robotic surgery will become routine using even more advanced imaging and data collection technologies than what we currently have on hand.
  9. Ever since a sheep named Dolly was successfully cloned, the potential to clone humans has existed. Self-duplicating humans are the stuff of science fiction and raise many ethical questions. Stem cells from cloning may prove an effective way to produce a custom human repair kit, growing replacement body parts from skin to neurons to major organs.
  10. If we are to believe futurists such as Ray Kurzweil we will have biotechnology in place in this century to defeat aging. The implications of extended longevity for humanity are numerous.

To learn more about each of these topics revisit this blog in coming weeks as we tackle each individually. As always, questions and comments are highly welcome.

When will computers become more human? – Part 2: From Two Bits to Quantum and Neuromorphic Computing

As computers evolve in the 21st century, we will move from the silicon-based technology that dominates today to entirely new technologies. Two technologies that have received considerable press are quantum computing and neuromorphic chips. Before we explain them, let’s review the 60-plus-year evolution of the computer.

Current computers use chips made of silicon. The first computers were mechanical devices used for astronomy, clock management, astrology and navigation. Often called analog computers, they reached the height of their sophistication as tools for military applications in World War II. In 1944 mechanical-electronic hybrid computers appeared, but ENIAC, in 1946, represented the first truly electronic computer. ENIAC used arrays of vacuum tubes, occupied a whole building and generated enormous heat.

The dominating logic of ENIAC and its mechanical predecessors was based on control gates. A gate was either “open” (“on”) or “closed” (“off”). Eventually these two states were translated into numeric values: binary digits, or bits. A bit could have one of two values, a 0 or a 1. Computer programs were constructed around this precise logic, and strings of 0s and 1s in groups of 8 became binary code, the basis of modern software programs.
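To make the eight-bit grouping concrete, here is a tiny illustration of how one such group encodes a character:

    # Eight bits form one byte: the pattern 01000001 encodes the number 65,
    # which in the ASCII character set represents the letter 'A'.
    bits = "01000001"
    value = int(bits, 2)
    print(value, chr(value))  # 65 A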

The adoption of silicon and the arrival of the transistor in the mid-1950s began the miniaturization of the technology. Computers got smaller and used less energy, because transistors didn’t have the demands of vacuum tubes. The microchip, an integrated circuit combining transistors, resistors and capacitors, showed up in the mid-1960s.

The technology we know today, the microprocessor, appeared in 1971. This was truly the dawn of the personal computer age, led by pioneers such as Gordon Moore of Intel, whose now oft-quoted “law” predicted that the number of transistors that could be placed on a microprocessor chip would double every two years. Moore’s Law has proven remarkably accurate. The first microprocessor, the Intel 4004, contained 2,250 transistors; by the advent of the Pentium 4, a single chip could contain 42 million. Present-day advanced silicon chips contain as many as a billion transistors. But the limits of silicon chip technology are being reached, and as a result engineers are experimenting with new technologies, including hybrid chips that combine silicon with other materials.
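Moore’s observation is easy to test with arithmetic. Starting from the 4004’s 2,250 transistors in 1971 and doubling every two years, a short calculation lands within the right order of magnitude for the chips that followed:

    # Project Moore's Law forward from the Intel 4004 (2,250 transistors, 1971),
    # doubling the count every two years.

    START_YEAR, START_COUNT = 1971, 2250

    def projected_transistors(year):
        doublings = (year - START_YEAR) / 2
        return int(START_COUNT * 2 ** doublings)

    for year in (1971, 2000, 2010):
        print(year, f"{projected_transistors(year):,}")

The projection gives roughly 52 million transistors for 2000, close to the Pentium 4’s 42 million, and over a billion by 2010, matching the advanced chips described above.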

At the same time as chip technology is evolving, our bit-based computer programming tradition is being challenged. Instead of computer information being written in binary code, we are seeing experiments that push us closer and closer to human-like thinking. This is where quantum computing and neuromorphic chip technology enter the picture.

Whereas binary logic is built on bits with values of “0” or “1,” quantum computing represents information as quantum bits, or qubits. A qubit can represent a “0” and a “1” at the same time. This is called quantum superposition: a single qubit holds a weighted combination of both states at once. Quantum computers, unlike the computers that we know and use every day, work at the atomic and subatomic level, using photons and electrons to carry qubits. So a quantum computing platform can be very tiny, and its power requirements can be minuscule.
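A qubit’s state can be written as a pair of complex amplitudes (α, β), where |α|² and |β|² give the probabilities of measuring a 0 or a 1. The sketch below simulates measuring an equal superposition many times:

    # Simulate measuring a qubit in equal superposition: amplitudes of
    # 1/sqrt(2) each give a 50/50 chance of reading 0 or 1 per measurement.
    import random

    alpha = beta = 2 ** -0.5    # equal-superposition amplitudes
    p_zero = abs(alpha) ** 2    # Born rule: probability of measuring 0

    counts = {0: 0, 1: 0}
    for _ in range(10000):
        counts[0 if random.random() < p_zero else 1] += 1
    print(counts)  # roughly {0: 5000, 1: 5000}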

If you haven’t had the pleasure of studying quantum physics, then the following explanation may confuse you. Let me share with you a 2006 study done at the University of Illinois using a photonic quantum computer. A research team led by physicist Paul Kwiat, a John Bardeen Professor of Electrical and Computer Engineering and Physics, presented the first demonstration of inferring information about an answer even though the computer did not run. The researchers reported their work in the February 23, 2006 issue of Nature. “It seems absolutely bizarre that counterfactual computation – using information that is counter to what must have actually happened – could find an answer without running the entire quantum computer, but the nature of quantum interrogation makes this amazing feat possible,” stated Professor Kwiat. The announcement went on to explain that quantum interrogation, sometimes called interaction-free measurement, is a technique that makes use of wave-particle duality (in this case, of photons) to search a region of space without actually entering it.

Let’s move on to another new technology: neuromorphic chips. In Part 1 of this topic I described the work of Dr. Kwabena Boahen at Stanford University. Dr. Boahen is constructing a computer organized to emulate the organized chaos of the human brain, starting with a biology-inspired hippocampus chip. Although made of silicon, it represents the foundational technology for what will be a neural supercomputer called Neurogrid. The differences between Dr. Boahen’s silicon chip and the logical digital computer chips we use today are twofold: the neuromorphic hippocampus chip uses very little energy, and it is deliberately designed not to be accurate in its calculations. Our normal computer chips make an error once per trillion calculations; the neuromorphic chip makes an error once in every 10 calculations.

This is very much in line with how the human brain works. Our 100 billion neurons fail to fire 30 to 90 percent of the time. Researchers blame neural noise as the major contributor to these lapses; they also see that noise as a primary contributor to human creativity. Neuromorphic chips likewise operate on the power equivalent of a couple of D batteries. They use so little power because of the way they work, emulating the wiring pattern of neural circuits. The current Neurogrid contains 45,000 silicon neurons, each chip using 1/10,000 to 1/500 of the power of conventional systems. And unlike conventional computers, the chips fire in waves, creating millisecond electric spikes. To create a Neurogrid with the total brainpower of a mouse, Boahen will require 64 million silicon neurons. He hopes to have this machine operational by 2011, consuming 64 watts of power to do its calculations.
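To see how computation can survive, and even exploit, that much unreliability, consider a toy population model: each neuron fails to fire most of the time, yet a population vote still recovers a clean answer. The failure rate and population size below are illustrative.

    # Toy model of unreliable units yielding a usable answer: each "neuron"
    # fires with only 30% probability when its input is active, but a
    # population vote recovers the input reliably. Numbers are illustrative.
    import random

    FIRE_PROBABILITY = 0.3   # unreliable: fails to fire 70% of the time
    POPULATION = 1000

    def population_response(input_active):
        """Count how many neurons fire for a given input."""
        p = FIRE_PROBABILITY if input_active else 0.02  # spontaneous noise
        return sum(random.random() < p for _ in range(POPULATION))

    print("input ON :", population_response(True), "of", POPULATION, "fired")
    print("input OFF:", population_response(False), "of", POPULATION, "fired")

Roughly 300 of the 1,000 unreliable neurons fire for an active input versus about 20 for silence, a margin wide enough that the population-level answer is unambiguous.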

What is the benefit of these neuromorphic chips? Unlike our familiar digital computer, which always gives the same answer to the same question, neuromorphic technology uses a fuzzier logic: it considers many different solutions, works by trial and error, and finds new shortcuts, much like the human brain.

In Part 3 of this blog we will look at biological computers and the progress we are making using DNA rather than silicon as the basic construct in our pursuit of creating artificial intelligence.