James Geary 

I, robot

Although we've always been a bionic species, says James Geary, we're now blurring the line between man and machine like never before. Below, three leading scientists reveal our bionic future
  
  


Consider the inner ear. It is a flowerbed inside a blacksmith's shop, a marvel of evolutionary engineering. Down below the auditory canal - past the hammer, the anvil and the stirrup - sprout the hair cells of the cochlea, planted in tidy rows along the basilar membrane like geraniums in a window box. As the hammer and anvil pound sound waves into shape, the stirrup taps out the beat on the basilar membrane, which sets the hair cells swaying like a cornfield in a breeze. Each of the hair cells' undulations fires electrical signals to the auditory nerve and on to the auditory centre of the brain. This is how we hear.

Now consider the mobile phone. Hands-free headsets are becoming ubiquitous, and it's common to see people walking down the street with a little telecommunicating scarab attached to their ear. Cochlear implants work deeper in the auditory canal. Consisting of an external microphone and implanted electrodes, these devices bypass the damaged hair cells that cause deafness, transmitting electrical impulses directly to the auditory nerve, which then forwards them to the brain. Auditory brainstem implants (ABIs) go even deeper. Implanted in people whose auditory nerves are severed, ABIs send sounds directly to electrodes located in the hearing centre of the brain. No biological ear required.

Each of these three devices is an example of bionics, and someone wearing a hands-free headset is just as much a cyborg as someone outfitted with an ABI. Bionics is simply the use of technology to extend, enhance or repair the human body, and we have been a bionic species ever since the first Homo sapiens used a stone to crack a nut or crush a skull. The telephone is an extension of the ear, the telescope an extension of the eye.

For thousands of years, bionics has been changing the way we experience the world, opening up our doors of perception just another crack. When combined with advances in tissue engineering, the cultivation of new organs and muscles to replace those damaged by injury or disease, it might one day blow those doors completely off their hinges.

Bionics was back in the news recently with the announcement that researchers at the University of Pittsburgh had trained two monkeys to munch marshmallows using a robotic arm controlled by their own thoughts. During voluntary physical movements, such as reaching for food, nerve cells in the brain start firing well before any movement actually takes place. It's as if the brain warms up for an impending action by directing specific clusters of neurons to fire, just as a driver warms up a car by pumping the gas pedal. The University of Pittsburgh team implanted electrodes in this area of the monkeys' brains and connected them to a computer operating the robotic limb. When the monkeys thought about reaching for a marshmallow, the mechanical arm obeyed that command. In effect, the monkeys had three arms for the duration of the experiments.

Scientists hope to use this type of brain-machine interface (BMI) to allow paralysed individuals to control prosthetic body parts. A BMI could make a detour around a damaged spinal cord, for instance, just as an ABI circumvents a severed auditory nerve. A paralysed patient with a BMI could then use their own brain signals to operate artificial limbs, wheelchairs, computers and even other electronic devices - just by thinking about it.

A BMI essentially reverses the experience of phantom limb syndrome, in which a person continues to sense a missing arm or leg long after it has been amputated or lost in an accident. Neuroscientist Miguel Nicolelis believes that people using BMIs will eventually come to regard robotic or prosthetic appendages as actual parts of their bodies, even if they are not physically attached to them.

Nicolelis has carried out experiments in which monkeys hooked up to a BMI in his lab at Duke University in North Carolina, controlled robots located in Massachusetts and Japan. The monkeys' brain signals were transmitted over the internet. This research has implications not just for prosthetics but for entertainment, too, since the technology could allow computer gamers, for instance, to remotely sense physical and virtual environments. Samuel Butler imagined something similar in his novel Erewhon, in which he described a society where machines were considered 'nothing but extra-corporeal limbs ... loose, and [lying] about detached, now here and now there, in various parts of the world'.

It's not just monkeys who use robotic limbs, though. Jesse Sullivan had both arms amputated after suffering a severe electric shock and was fitted with a prosthetic arm developed at the Rehabilitation Institute of Chicago's Neural Engineering Center for Artificial Limbs. US Army Sergeant Juan Arredondo lost his left hand in Iraq after an improvised explosive device ripped through his patrol vehicle. He now has an i-LIMB artificial hand, made by Scottish firm Touch Bionics. Neither of these devices is a BMI: the prosthetic is connected not to the amputee's brain but to still-functioning nerves and muscles in the residual limb. Still, the experiences of Sullivan and Arredondo show that hardware can be successfully grafted onto the human body.

And it's not just limbs and ears that are getting the bionic treatment, either. Several labs are developing bionic eyes that allow patients with conditions like retinitis pigmentosa - in which photoreceptor cells in the retina die off, causing blindness - to perceive basic shapes again. The system consists of a tiny video camera mounted on a pair of glasses that wirelessly transmits images to electrodes implanted in the optic nerve. Artificial muscles are also in the works, made of shape memory alloys that expand and contract in response to external stimulation.

If you prefer organic rather than metallic muscles, tissue engineering offers an alternative to the hardware of bionics. Scientists in this emerging field build new tissues and organs from scratch, using living cells, in the hope of eventually being able to transplant them into patients. Earlier this year, researchers at Massachusetts General Hospital announced the construction of a bio-artificial heart. The team first stripped all the cells from a rat heart, so that only the muscular scaffolding remained. They then seeded this basic structure with neonatal cardiac cells, placed it in a bioreactor, and within two weeks the organ was beating and conducting electrical impulses. There is still a very long way to go from the bio-artificial heart beating in a lab in Massachusetts to a similar organ actually beating in the chest of a living patient, but these preliminary results are encouraging.

The same technology is being brought to bear on other organs as well. A group at the Canadian National Research Council's Institute for Chemical Process and Environmental Technology is developing a tissue-engineered cornea to help people suffering from corneal blindness. And researchers in the Department of Physics at the University of Missouri are pioneering organ printing, in which cell structures are laid down on a sheet of nutrients just as a conventional printer prints words on a piece of paper. So far, the team has used the technique to create a network of functioning blood vessels. Eventually, print-on-demand tissues could be combined with bionics to create limbs, organs and other body parts that are part human, part machine.

Should we be encouraged or alarmed by all this? A little bit of both is always healthy, but probably more of the former than the latter. As with all technologies, tissue engineering and bionics offer tremendous benefits and tremendous risks. At least one activist group is already lobbying against bionic implants. The No Verichip Inside Movement denounces the use of the Verichip, a radio-frequency identification (RFID) tag that can be implanted just beneath the skin to, for example, track elderly patients with dementia in care homes.

Yes, the Verichip could be modified to track someone's movements and so invade their privacy. But that's already quite easy to do simply by following debit card transactions, surveillance camera footage, or mobile phone signals. What's different about bionics and tissue engineering is that the technology is moving from outside to inside our bodies. We've always used technology to modify the external world; now we're starting to use it to modify our internal world, too. This transition is sure to generate fresh moral and ethical questions as science gets further under our skin, but the issues are likely to be similar to those already raised by more familiar technologies including genetics and the internet.

Computer pioneer Norbert Wiener once advised: 'Render unto man the things which are man's and unto the computer the things which are the computer's.' Wise counsel, indeed. Trouble is, it's becoming increasingly hard to tell the difference.

· James Geary is the author of The Body Electric: an Anatomy of the New Bionic Senses as well as two books about aphorisms: The World in a Phrase: a Brief History of the Aphorism and Geary's Guide to the World's Great Aphorists.

The brain-machine interface
Professor Miguel Nicolelis

The idea behind our first brain-machine interface (BMI) was to use arrays of hundreds of electrodes to sample the activity of the many brain cells, distributed across the brain, that are involved in generating movement. The electrical signals from the implanted electrodes were sent to a computer running models that learned to extract the movement information embedded in that brain activity and translate it into digital commands. The output of these models can then be used to control a variety of devices, such as robotic arms, wheelchairs or computer cursors, locally or remotely.
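To make that decode-and-translate step concrete, here is a toy sketch in Python of the kind of linear decoder early BMIs used, fitted by least squares on synthetic spike counts. The dimensions, variable names and Poisson firing model are illustrative assumptions, not details taken from Nicolelis's experiments.

```python
import numpy as np

# Toy illustration only: a linear decoder mapping binned spike counts to a
# two-dimensional movement command. All numbers here are made up.
rng = np.random.default_rng(0)

n_samples, n_neurons = 2000, 100                 # calibration time bins x recorded cells
true_weights = rng.normal(size=(n_neurons, 2))   # hidden relationship used to fake the data

rates = rng.poisson(5.0, (n_samples, n_neurons)).astype(float)                # synthetic firing rates
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))  # synthetic x/y hand velocity

# "Learn how to extract the information": fit decoder weights by least squares
# on a calibration block where both brain activity and movement are observed.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# At run time, each new vector of firing rates becomes a digital command
# (here a 2-D velocity) that a robotic arm, wheelchair or cursor could follow.
new_rates = rng.poisson(5.0, n_neurons).astype(float)
command = new_rates @ weights
print(command)
```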

BMI technology is barely 10 years old, but it has evolved very quickly. My colleagues and I made the first demonstration of a BMI in 1999 - a rat was able to use a robotic arm to grab drops of water and move them to its mouth. In 2000 we made the first primate demonstration of that device, and in 2003 we used rhesus monkeys. Then, in 2004, we published the first human study with an invasive (surgically implanted) BMI, in Parkinson's patients, which showed that BMIs work in the same way for humans as they do for monkeys. With these devices we also experimented with closed-loop BMIs: sensors that generate feedback to inform the brain how the device is performing the job it's trying to accomplish.

Our latest development, which we're announcing this month, is that this sensory feedback from the device, rather than being sent to the skin or the eye, can be delivered directly into the brain. We're naming this technique the brain-machine-brain interface. Suppose you have a robotic hand that is touching an object. We have demonstrated a way to send signals from this hand directly back to the areas of the brain which process tactile information, so the feedback tricks the brain into thinking that the robotic hand is actually an extension of the body. What we are demonstrating is that the brain can incorporate these devices as it would an ordinary limb.

The monkeys we have been working with have acquired what is essentially a third arm, and we have done experiments in which a monkey at Duke University in North Carolina has operated, and received feedback from, a robot in Kyoto, Japan. It is touching and feeling things thousands of miles away; can you imagine what that experience would be like? There was a point at which the monkey was walking on a treadmill and the robot in Japan was doing exactly the same thing, but when the monkey stopped walking, the robot carried on. This means that the animal was conditioned to imagine what it had to do to move, and was able to sustain it even though it didn't need to move itself. It's a breakthrough which gives us hope that a paraplegic patient will be able to do the same thing.

I believe that we are now very close to making an attempt at something I've been working on for five years: to build an international consortium of non-profit organisations to collaborate on the Walk Again project. This is a campaign to get the best computer scientists, roboticists, neuroscientists and neurosurgeons in the world to make a paraplegic or quadriplegic patient walk again, using a combination of technologies that are emerging from the BMI field. The idea is to bypass the spinal cord (which is ruptured in paraplegic or quadriplegic patients) and, instead, use a wireless link to send a message from a microchip implanted on the surface of the brain straight to an exoskeleton - essentially a wearable robot - that will allow him or her to walk again.

This technology will allow the brain to act independently of the body. Patients will not only be able to control devices that they wear, but also operate devices that are some distance away while experiencing feedback from them. This will amount to an enormous change in the way we think about what the brain can accomplish.

I think we will be able to start large-scale clinical trials on humans next year, and treatment for paraplegic patients is expected to be available within the next five years. In a decade's time I can see us working on the restoration of language in stroke patients and on prosthetic devices for vision, hearing and touch, and we should be on our way to discovering methods to deal with Parkinson's and Alzheimer's with adapted BMI technology.

The body's going to be very different 100 years from now. In a century's time you could be lying on a beach on the east coast of Brazil, controlling a robotic device roving on the surface of Mars, and be subjected to both experiences simultaneously, as if you were in both places at once. The feedback from that robot millions of miles away will be perceived by the brain as if it was you up there. This technology is no longer in the realm of science fiction - it is real.

· Miguel Nicolelis is the Anne W Deane Professor of Neuroscience at Duke University, North Carolina.

The bionic eye
Robert Greenberg

Neuroprosthetics use electrical stimulation either to restore function to certain parts of the body or to treat disorders. Cochlear implants have been around since the Eighties and spinal cord stimulators (which can treat chronic pain disorders) since the late Sixties, so our aim 10 years ago, when we started Second Sight, was to apply these technologies to the retina. Our original concept was to build a cochlear implant for the eye. The thought was that if you could create a microphone that could pick up sound and transmit it through electrical impulses to the inner ear, perhaps you could do the same thing with vision. I did my thesis at Johns Hopkins University under a surgeon, Eugene de Juan, who was doing some early work with blind patients suffering from retinitis pigmentosa (a group of genetic eye conditions that cause the sufferer to become progressively blind over the course of their life). We tested around 25 patients with fine electrical wires, and they were able to see small spots of light when we electrically stimulated their retinas.

After my thesis I began engineering and building one of these prosthetics to be commercially available to blind patients. In 2002, we implanted six patients with what was essentially a modified cochlear implant, with 16 electrodes, and to date these have done much better than expected.

At first they were only intended as a proof of concept - 16 electrodes will only give you a low-resolution 4x4-pixel picture - but what surprised us was that, by moving their heads around to build a fuller picture, the patients could take this crude, blurry image and their brains would fill in the gaps to create something of higher resolution. We hadn't anticipated this: with this incredibly basic device, blind people could suddenly start achieving workable vision.

The retinal prosthesis uses a tiny video camera (the size of a pinhead) attached to a pair of glasses. We take the image from that camera, process it by breaking it up into very big pixels (the original had 16), and send the information from each pixel to the brain through electrodes attached to the back of the retina (the retina is the thickness of two human hairs, and the consistency of one ply of wet toilet paper, so designing something that could easily attach onto it wasn't the easiest of tasks). Every point on the retina corresponds to a point in space, so, if you imagine the array of electrodes as a chequerboard on the back of the eye, we can make a patient see light in a particular place by electrically stimulating the electrode on the corresponding square of that chequerboard. Using this technique, in concert with the incoming video signal, you can create the perception of images. At the moment these images are in black and white, or rather grey and white, but we've discovered a technique whereby colour might be achievable, and this is something we will be working on for future models. I can see colour models becoming available within five years.
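As a rough illustration of the pixelation step Greenberg describes, the sketch below downsamples a greyscale camera frame into a 4x4 "chequerboard" of brightness levels, one per electrode. The function name, the grid size and the 0-1 stimulation scale are assumptions made for the example; the real device also has to turn these levels into safe, charge-balanced stimulation pulses, which is omitted here.

```python
import numpy as np

def frame_to_electrode_levels(frame: np.ndarray, grid: int = 4) -> np.ndarray:
    """Average a greyscale frame into a grid x grid block of brightness values,
    one per electrode on the retinal array (illustrative sketch only)."""
    h, w = frame.shape
    # Crop so the frame divides evenly, then average each block of pixels.
    blocks = frame[: h - h % grid, : w - w % grid].reshape(
        grid, (h - h % grid) // grid, grid, (w - w % grid) // grid
    )
    levels = blocks.mean(axis=(1, 3))   # one value per "chequerboard" square
    return levels / 255.0               # normalise to a 0-1 stimulation level

# Example: a synthetic 64x64 frame with a bright patch in the top-left corner.
frame = np.zeros((64, 64), dtype=np.uint8)
frame[:16, :16] = 255
print(frame_to_electrode_levels(frame))  # top-left electrode ~1.0, the rest ~0.0
```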

When we started working in this field, we didn't know whether it would work for blind people or just the visually impaired, but now we know definitively that blind people are able to see images with this technology. It may not work for all forms of blindness, but we know for sure that it works for patients with retinitis pigmentosa or macular degeneration, as these are the two target populations that we've been working with. With these conditions, the photoreceptors - the cells that turn light into a chemical-electrical signal to be sent to the brain - stop working, and the rest of the retina was assumed to have died along with them. However, a study at Johns Hopkins, carried out before we began working in this field, found that the remaining retinal cells weren't in fact dead, just dormant, and that they could be activated with electrical impulses - this is why our patients could see spots of light. So what we did is bypass the process that normally turns light into electrical impulses, and instead deliver the electricity directly to those surviving cells, which pass the signal on to the brain.

We're currently at the large-scale clinical trial stage with Second Sight, holding international trials with blind patients at Moorfields Eye Hospital in London, a hospital in Paris, another in Geneva, as well as a number around the US. The model we have now is similar to our original, with a camera built into the glasses, but we have four times as many electrodes, or pixels. It's still a relatively low resolution, but we think it should be enough to allow our patients to perform useful tasks. We're hoping to be able to launch this commercially in Europe soon. How soon is soon depends on the regulatory authorities and how well this trial turns out, but I wouldn't be surprised if it were launched within the next couple of years in Europe - the US will probably take a little longer, as getting approval there can be tougher.

We are also working on a project, funded by the Department of Energy, that is attempting to increase the resolution to several hundred pixels. I doubt in my lifetime we'll see blind patients achieving perfect sight through these means, but very soon, maybe even with the devices we're testing now, we'll achieve the ability to give the blind useful sight. I expect the same sort of developments to happen with deaf people and cochlear implants, so within the next 50 or so years deafness and blindness may well be curable. It's not hard to imagine that this technology will eventually lead to better than perfect sight or hearing - the bionic six-million-dollar-man idea - but it's still a long way off.

· Robert Greenberg is the managing director, president and chief executive of Second Sight.

Tissue engineering
Professor Dame Julia Polak

Thirteen years ago I lost both of my lungs and my heart to pulmonary hypertension. I could see that while I was waiting for my transplant, people around me were dying due to a lack of donor organs, so when my ordeal was over I decided to focus my research on this problem.

I'd been working in the lung field before my transplant and, coincidentally, there was an expert at Imperial College London working on man-made bioactive materials, so we decided to work together towards using stem-cell technology to produce artificial organs for implantation.

About 10 years ago we discovered that the younger the cells are, the better they grow, and that embryonic stem cells would be brilliant for our purpose. We began linking up these cells with man-made material and placing them in animal models.

We've known the advantages of stem cells for some time - research began 50 years ago with stem cells from bone marrow - but there's still so much basic research to be done. There is also much debate around the use of embryonic stem cells. People have, however, developed methods to produce the same type of cell by inducing pluripotency [the ability of a stem cell to grow into any cell type] in adult cells, so there may be more ethical ways to get hold of the cells, but it's still early days.

Pluripotent stem cells are the principal stem cells, but stem cells also exist throughout the adult body. If you cut your skin, it will regenerate using stem cells, but as you get older you slowly begin to lose this ability. If we could find a way to stimulate it, degeneration could be avoided.

Tissue regeneration using the manipulation of stem cells is revolutionary, but it will take time before it can be used in clinical practice. Modern research into pluripotent stem cells only began 10 to 15 years ago, and it could take as long again before we are using this technology on patients. Several parts of the body have already been grown artificially and tested on animals, including artificial skin, cartilage, certain parts of the eye, blood vessels and bladders. They function well, but it's still patchy as to which parts of the body we can replicate. The lungs - my particular field - are such complex organs that it will take much longer before transplantable versions can be grown.

· Professor Dame Julia Polak is founder of the Tissue Engineering and Regenerative Medicine Centre at Imperial College London.

 
