BRAIN IMPLANTS POWERED BY SPINAL FLUID
by George Dvorsky / Jun 15, 2012
The biggest question for would-be cyborgs is: How are you going to power all those brain implants? Now it looks like some MIT engineers may have stumbled upon the answer. They have developed a fuel cell that can run on your brain’s own glucose — a breakthrough that could lead to powerful neural prosthetics capable of restoring and controlling a number of bodily functions. Here’s how it would work — plus why this breakthrough could combine with two other recent developments to make a cyborg future much closer than it was before.

The glucose fuel cell isn’t an entirely new idea. Back in the 1970s, scientists showed that a pacemaker could be powered by the body’s own sugar, but lithium-ion batteries proved more practical. Moreover, the glucose fuel cell of that era required enzymes to work, which didn’t bode well for long-term implantation in the body.

To overcome this problem, a team led by Rahul Sarpeshkar at MIT developed a new kind of glucose fuel cell, fabricated from silicon using the same technology used to make semiconductor chips. As a result, unlike the fuel cell of the 1970s, this new version has no biological components. In the old version, cellular enzymes broke down glucose to generate ATP, the cell’s energy currency. In the updated model, a platinum catalyst strips electrons from glucose instead.
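For readers who want the chemistry, the electrode reactions in a platinum-catalyzed glucose fuel cell of this kind are usually written as a two-electron oxidation of glucose to gluconic acid at the anode, with oxygen reduced to water at the cathode. This is the textbook picture, not notation taken from the MIT paper itself:

```latex
% Anode: glucose is partially oxidized (2 of its 24 available electrons)
\mathrm{C_6H_{12}O_6 \;\longrightarrow\; C_6H_{10}O_6 + 2\,H^+ + 2\,e^-}
% Cathode: oxygen is reduced to water
\mathrm{O_2 + 4\,H^+ + 4\,e^- \;\longrightarrow\; 2\,H_2O}
```

The electrons released at the anode flow through the external circuit, which is what powers the implant.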
The result: a fuel cell that can generate up to hundreds of microwatts — enough to power a neural implant. And best of all, there’s little chance of rejection or long-term loss of function. From the MIT news release:
[The researchers] calculated that in theory, the glucose fuel cell could get all the sugar it needs from the cerebrospinal fluid (CSF) that bathes the brain and protects it from banging into the skull. There are very few cells in the CSF, so it’s highly unlikely that an implant located there would provoke an immune response. There is also significant glucose in the CSF, which does not generally get used by the body. Since only a small fraction of the available power is utilized by the glucose fuel cell, the impact on the brain’s function would likely be small.
The researchers are hopeful that their fuel cells will power assistive devices, including implants for people with spinal-cord injuries. They admit that it may be a few years before this happens; the next step will be to demonstrate that the cell can work in a living animal.

The breakthrough is part of an ongoing trend in cybernetics in which biological functions are steadily being mimicked with microelectronics, and in which the body’s own natural processes are leveraged to restore function or provide energy. We recently covered the potential for chemical circuits and synthetic synapses, two ideas that are also bringing us closer to true cyborg status. Sarpeshkar himself, in addition to glucose fuel cells, has explored advanced cochlear implants and brain-machine interfaces. This breakthrough will let that work move forward, as he has now found a way to make such devices self-powered.
A Glucose Fuel Cell for Implantable Brain–Machine Interfaces
“We have developed an implantable fuel cell that generates power through glucose oxidation, producing both steady-state and peak power. The fuel cell is manufactured using a novel approach, employing semiconductor fabrication techniques, and is therefore well suited for manufacture together with integrated circuits on a single silicon wafer. Thus, it can help enable implantable microelectronic systems with long-lifetime power sources that harvest energy from their surrounds. The fuel reactions are mediated by robust, solid state catalysts. Glucose is oxidized at the nanostructured surface of an activated platinum anode. Oxygen is reduced to water at the surface of a self-assembled network of single-walled carbon nanotubes, embedded in a Nafion film that forms the cathode and is exposed to the biological environment. The catalytic electrodes are separated by a Nafion membrane. The availability of fuel cell reactants, oxygen and glucose, only as a mixture in the physiologic environment, has traditionally posed a design challenge: Net current production requires oxidation and reduction to occur separately and selectively at the anode and cathode, respectively, to prevent electrochemical short circuits. Our fuel cell is configured in a half-open geometry that shields the anode while exposing the cathode, resulting in an oxygen gradient that strongly favors oxygen reduction at the cathode. Glucose reaches the shielded anode by diffusing through the nanotube mesh, which does not catalyze glucose oxidation, and the Nafion layers, which are permeable to small neutral and cationic species. We demonstrate computationally that the natural recirculation of cerebrospinal fluid around the human brain theoretically permits glucose energy harvesting at a rate on the order of at least 1 mW with no adverse physiologic effects. Low-power brain–machine interfaces can thus potentially benefit from having their implanted units powered or recharged by glucose fuel cells.”
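To get a feel for the ~1 mW figure in the abstract, here is a rough back-of-envelope estimate. The constants are approximate physiological and chemical textbook values (CSF glucose around 60 mg/dL, CSF production around 0.35 mL/min, full glucose oxidation around 2870 kJ/mol), not numbers taken from the paper, so treat this as an illustrative sanity check rather than the authors’ calculation:

```python
# Back-of-envelope: how much chemical power does CSF glucose turnover carry?
# All constants are approximate textbook values, not taken from the paper.

GLUCOSE_CONC_G_PER_L = 0.6                 # ~60 mg/dL, typical CSF glucose level
CSF_PRODUCTION_L_PER_S = 0.35e-3 / 60.0    # ~0.35 mL/min CSF production rate
GLUCOSE_MOLAR_MASS = 180.16                # g/mol
DELTA_G_COMBUSTION = 2.87e6                # J/mol, complete oxidation of glucose

glucose_mol_per_s = (GLUCOSE_CONC_G_PER_L * CSF_PRODUCTION_L_PER_S
                     / GLUCOSE_MOLAR_MASS)
total_chemical_power_w = glucose_mol_per_s * DELTA_G_COMBUSTION

print(f"Chemical power in CSF glucose turnover: {total_chemical_power_w * 1e3:.1f} mW")
```

This works out to a few tens of milliwatts in total. Since the cell’s two-electron oxidation captures only a fraction of the 24 electrons of full combustion, and only some of the glucose reaches the anode, harvesting on the order of 1 mW without disturbing the brain’s supply is at least plausible.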
by George Dvorsky / Jun 8, 2012
For those would-be cyborgs who are squeamish about implanting awkward computer chips into their bodies, a doctoral student at Sweden’s Linköping University may have stumbled upon a rather elegant solution. Klas Tybrandt has developed the world’s first integrated chemical circuit — a control system that channels neurotransmitters instead of electric voltages.

Tybrandt, who studies organic electronics (which is cool in and of itself), combined special transistors he developed into an integrated circuit capable of transmitting positive and negative ions as well as biomolecules. The advantage of this approach is that instead of carrying electrons, the circuits can carry chemical substances that can be configured to perform a variety of functions. The development will enable cyberneticists to control and regulate the signal paths of cells in the human body.

Tybrandt’s breakthrough creates the basis for an entirely new circuit technology based on ions and molecules rather than electrons and holes. According to Magnus Berggren, a professor of organic electronics, this bodes well for people whose signalling systems aren’t working properly: to overcome the impairment, the system can send signals directly to muscle synapses instead. The chemical circuit can work with biological signalling substances such as acetylcholine, a neurotransmitter active in both the peripheral and central nervous systems.

Looking ahead, Tybrandt, along with Robert Forchheimer, a professor of information coding at LiU, hopes to develop chemical chips that also contain logic gates, such as NAND gates, which would allow all logical functions to be constructed.
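That closing point about NAND gates is the standard universality result from digital logic: every Boolean function can be built from NAND alone, which is why a chemical NAND gate would be such a big deal. A quick illustration in ordinary Python (simulating the logic, not the chemical circuitry itself):

```python
def nand(a: bool, b: bool) -> bool:
    """The one primitive gate; everything below is built only from it."""
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Sanity check against Python's built-in operators over all inputs:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```

With NOT, AND, and OR in hand, any truth table can be realized, so a chemical circuit family only needs to get one gate right.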
June 20, 2012
Researchers at the University of Southern California’s Viterbi School of Engineering have developed the BioTac, a robot appendage that can outperform humans in identifying a wide range of natural materials by their textures, paving the way for advances in prostheses, personal assistive robots, and consumer product testing.
The BioTac sensor is a new type of tactile sensor built to mimic the human fingertip, paired with a newly designed algorithm that decides how to explore the outside world by imitating human strategies. The sensor is capable of other human sensations as well: it can tell where and in which direction forces are applied to the fingertip, and can even sense the thermal properties of an object being touched.

Like the human finger, the BioTac sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.
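As a toy illustration of the vibration idea, here is a sketch in Python. The synthetic signals and the two features (signal energy as a roughness proxy, zero-crossing rate as a fineness proxy) are my own simplifications, not SynTouch’s actual processing:

```python
import math

def vibration(freq_hz, amp, n=2000, fs=2000.0):
    """Synthetic hydrophone trace: sliding over a texture with a dominant
    spatial period produces a dominant vibration frequency (idealized)."""
    return [amp * math.sin(2 * math.pi * freq_hz * i / fs) for i in range(n)]

def roughness(sig):
    """RMS amplitude of the vibration, a stand-in for perceived roughness."""
    return math.sqrt(sum(x * x for x in sig) / len(sig))

def fineness(sig, fs=2000.0):
    """Zero-crossing rate as a crude estimate of the dominant frequency:
    finer textures excite higher-frequency vibrations."""
    crossings = sum(1 for a, b in zip(sig, sig[1:]) if a * b < 0)
    return crossings * fs / (2 * len(sig))  # approximate frequency in Hz
```

A coarse, heavy texture (say, a 50 Hz ripple at high amplitude) scores high on roughness and low on fineness; a fine, light one (400 Hz at low amplitude) does the opposite, so the two features already begin to separate materials.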
When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by the 18th-century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. USC Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel describe their method for solving this general problem as “Bayesian Exploration.”
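A minimal sketch of what such Bayesian exploration might look like, with invented textures, movements, and Gaussian measurement models standing in for the robot’s trained priors (none of these numbers come from the study):

```python
import math

# Hypothetical prior database: mean and std of a single vibration feature
# for each (texture, movement) pair -- invented stand-ins for training data.
DB = {
    "denim":     {"slow": (0.30, 0.05), "fast": (0.70, 0.05)},
    "paper":     {"slow": (0.32, 0.05), "fast": (0.40, 0.05)},
    "sandpaper": {"slow": (0.80, 0.05), "fast": (0.75, 0.05)},
}

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def update(posterior, movement, measurement):
    """Bayes' rule: reweight each candidate texture by how likely the
    observed measurement is under that texture's model for this movement."""
    post = {t: p * gauss_pdf(measurement, *DB[t][movement])
            for t, p in posterior.items()}
    z = sum(post.values())
    return {t: p / z for t, p in post.items()}

def best_movement(posterior):
    """Choose the exploratory movement that best separates the two
    currently most probable textures (a greedy information heuristic)."""
    top_two = sorted(posterior, key=posterior.get, reverse=True)[:2]
    def separation(m):
        (mu1, s1), (mu2, s2) = DB[top_two[0]][m], DB[top_two[1]][m]
        return abs(mu1 - mu2) / (s1 + s2)
    return max(DB[top_two[0]], key=separation)
```

Starting from a uniform prior, the sketch prefers the “fast” slide (which separates denim from paper far better than the “slow” one), and a single measurement near 0.70 then makes denim the clear favorite. This mirrors the paper’s loop of choosing each next movement based on what has been learned so far.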
Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.
The researchers say this robot touch technology could be used in human prostheses, or to assist companies that employ experts to assess the feel of consumer products and even human skin. Loeb and Fishel are partners in SynTouch LLC, which develops and manufactures tactile sensors for mechatronic systems that mimic the human hand. Founded in 2008 by researchers from USC’s Medical Device Development Facility, the start-up now sells its BioTac sensors to other researchers and to manufacturers of industrial robots and prosthetic hands.
Bayesian exploration for intelligent identification of textures
Jeremy A. Fishel and Gerald E. Loeb, “Bayesian Exploration for Intelligent Identification of Textures,” Frontiers in Neurorobotics, 2012. DOI: 10.3389/fnbot.2012.00004 (open access)
“In order to endow robots with human-like abilities to characterize and identify objects, they must be provided with tactile sensors and intelligent algorithms to select, control, and interpret data from useful exploratory movements. Humans make informed decisions on the sequence of exploratory movements that would yield the most information for the task, depending on what the object may be and prior knowledge of what to expect from possible exploratory movements. This study is focused on texture discrimination, a subset of a much larger group of exploratory movements and percepts that humans use to discriminate, characterize, and identify objects. Using a testbed equipped with a biologically inspired tactile sensor (the BioTac), we produced sliding movements similar to those that humans make when exploring textures. Measurement of tactile vibrations and reaction forces when exploring textures were used to extract measures of textural properties inspired from psychophysical literature (traction, roughness, and fineness). Different combinations of normal force and velocity were identified to be useful for each of these three properties. A total of 117 textures were explored with these three movements to create a database of prior experience to use for identifying these same textures in future encounters. When exploring a texture, the discrimination algorithm adaptively selects the optimal movement to make and property to measure based on previous experience to differentiate the texture from a set of plausible candidates, a process we call Bayesian exploration. Performance of 99.6% in correctly discriminating pairs of similar textures was found to exceed human capabilities. Absolute classification from the entire set of 117 textures generally required a small number of well-chosen exploratory movements (median = 5) and yielded a 95.4% success rate. The method of Bayesian exploration developed and tested in this paper may generalize well to other cognitive problems.”