Become a BCI Developer with the MindSet Development Tools

“Begin developing products and applications that interface with the brain by using NeuroSky’s MindSet Development Tools. NeuroSky’s Brain-Computer Interface (BCI) turns the user’s thoughts into actions, unlocking new worlds of interactivity. By monitoring your brain waves, the headset can send messages to a device, allowing the user to control the device with their thoughts. The MindSet Development Tools (MDT) include everything necessary to develop applications for the MindSet. The NeuroSky MindSet can be paired with video games, research devices or a number of other tools for an enhanced user experience. Upon approval, applications created using the MindSet Development Tools qualify for entry into the NeuroSky Store.

* A convenient, fully documented API library and drivers for common platforms (such as Windows, Windows Mobile, and Symbian) with support across multiple programming languages (such as C/C++, C#, .NET, and Java).
* All the documentation, guides, and examples necessary to develop your own BCI-enabled software or application on many popular computing platforms using the MindSet.”
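The bullet points above promise an API for reading the headset’s data stream. As a rough illustration of what consuming that stream involves (the packet layout here follows commonly published descriptions of NeuroSky’s “ThinkGear” serial protocol and is an assumption, not the official API): packets begin with two 0xAA sync bytes, then a payload length, the payload rows, and a one-byte checksum.

```python
def parse_packets(data):
    """Extract (name, value) rows from a ThinkGear-style byte stream.

    Assumed layout (per commonly published descriptions, not NeuroSky's
    official docs): 0xAA 0xAA sync, payload length, payload, then a
    checksum equal to the ones' complement of the payload sum's low byte.
    """
    values, i = [], 0
    data = bytes(data)
    while i + 2 < len(data):
        if data[i] != 0xAA or data[i + 1] != 0xAA:
            i += 1                      # not at a sync pair: slide forward
            continue
        plength = data[i + 2]
        if i + 4 + plength > len(data):
            break                       # incomplete packet at end of stream
        payload = data[i + 3:i + 3 + plength]
        if (~sum(payload)) & 0xFF != data[i + 3 + plength]:
            i += 1                      # bad checksum: resync byte by byte
            continue
        j = 0
        while j + 1 < plength:
            code = payload[j]
            if code == 0x02:            # poor-signal quality
                values.append(("signal", payload[j + 1])); j += 2
            elif code == 0x04:          # attention, 0-100
                values.append(("attention", payload[j + 1])); j += 2
            elif code == 0x05:          # meditation, 0-100
                values.append(("meditation", payload[j + 1])); j += 2
            elif code >= 0x80:          # multi-byte row: next byte is length
                j += 2 + payload[j + 1]
            else:                       # other single-byte rows
                j += 2
        i += 4 + plength
    return values
```

A real client would read these bytes from the headset’s serial or Bluetooth port; the sketch only shows the framing and checksum logic.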


“Remember the Neurosky mind-gaming headset we tried earlier this year? The one that actually worked? It’s getting a free SDK. This means both developers at large studios and dudes in their basements can make programs and games that do things with the data generated by the headset. And you generate data just by thinking a certain way. What kind of stuff can these developers do? Well, larger companies can make this an additional controller to supplement their normal games, such as reloading just by concentrating or lifting boxes and “setting crap on fire.” Independent developers can make weird one-off games that can really stretch the limits of what the Neurosky Mindset can read from your brain.”

NeuroSky’s Brainwave Device Lets Users Move Objects With Their Thoughts / July 17, 2008

Attention. Meditation. Two default human settings. You pay attention when you’re driving. You’re in a state of meditation when you watch the ocean, or perhaps a good movie. You may even meditate on purpose, sitting on a pillow with closed eyes. Silicon Valley innovator NeuroSky has given new meaning to these two states of mind. Attention means you can move things back and forth. Meditation means you can levitate objects. Sound outlandish, like a lucid dream or a hallucinogenic flashback? Actually, all it takes is a headset. NeuroSky, using a consumer-friendly EEG device, is taking Virtual Reality one step closer to just plain Real. NeuroSky has created a wearable biosensor allowing users to control electronic devices with their minds. A red wire feeds brainwaves into a patented device that turns them into commands. The device uses a series of emotion-based algorithms to allow users to push and pull objects (virtual or external) based on an attention or meditation rating. For example, users can levitate virtual objects with a strong meditation rating, or push/pull objects with a strong attention rating. Their technology allows users to control just about any mechanical system. They plan to expand into brainwave-read robots for the elderly and disabled in the future, as well as diversify their command suite to include more emotions and subtle movements such as eye blinks. Business Pundit interviewed Greg Hyver, VP of Marketing at the San Jose brainwave device maker, to find out more about NeuroSky’s fascinating EEG device.

Q: According to the website, NeuroSky’s goal is to “take medical sensor technology out of the hospitals, institutes and universities and put it into the hands of the average consumer in order to enhance lifestyles.” Can you talk about some progress you’ve made in that field recently?
A: Sure. The first challenge was actually non-technical in nature. We had to identify the conditions in which an “average” consumer would actually purchase a device that reads and translates their brainwaves to perform a function. NeuroSky identified five major categories that we had to address to establish our products’ design parameters: (a) price; (b) wearability; (c) ease-of-use; (d) mobility; (e) utility. It all came down to building a dry (no electrode gel), single-EEG-sensor solution that met each of our markets’ technical and costing requirements.

The medical industry already uses multiple-sensor headsets, and it didn’t seem to make much sense for us to produce another “medical-like” headset. You may lose a few features with a single sensor versus multiple sensors, but a single sensor is much more intuitive: a consumer who has never used this type of technology can understand and control it more quickly and consistently. We didn’t want to frustrate the consumer with a multiple-sensor headset that created head-placement problems, lengthy calibration periods, special training sessions, non-repeatable performance, and high-end (and expensive) hardware requirements that limited a platform’s mobility and ability to support it.

By making it simple, we created a fully-embedded headset where all of the processing (reading-filtering-amplifying EEG, translating the mental states) is done on the headset itself, without requiring a remote processing device. Since we don’t steal processor bandwidth from the remote device (for example, a simple toy product), our technology can effectively communicate with any product that can read our data stream.

Q: Who do you see as your biggest markets, considering the numerous possible applications of your technology?
A: Our early markets, what we consider the low-hanging fruit, are in toys, video games and interactive music (music controlled by mood). These are clearly entertainment-related applications where users can experience brainwave control in a casual and fun environment. This is where some of the earliest end-products from our customers should begin hitting the mass market shelves, probably in late 2008.

More advanced markets that we consider fairly massive would be the simple wellness market, especially for the baby boomer population (e.g. brain training, stress reduction, meditation), the education market (e.g. methods to improve learning in children), the cognitive disorder therapies market (e.g. ADD-ADHD, addiction, phobias, anxiety, PTSD, etc.) and the transportation market (e.g. sleep detection devices). These more advanced markets require more sophisticated applications to be built and tested, which means a longer time-to-market for these products.

Q: Why the licensing model? If the unit is relatively inexpensive, why not just sell directly to consumers?
A: I’ve always liked this question because it pinpoints why I believe we will be so successful with our products. NeuroSky is a core technology provider. We license our MindKit-EM™ SDK to the developer community to enable them to easily integrate our technology into the applications they develop. Once they are ready to enter the mass market, we give them two options: (a) they may purchase our ThinkGear-EM™ modules and go off to design and build their own headset customized to their own market requirements; (b) they may purchase our off-the-shelf MindSet™ commercial headsets, place their logo on it, and bundle it with their application. Our clients then distribute and promote their products through their established channels. NeuroSky simply does not have, nor do we prefer to build, the infrastructure to reach the end-retail market, so we back-door through our OEM partners’ channels.

Q: Was your technology developed in association with any major research institutions, such as universities?
A: Yes, three university professors were involved in developing our technology. They were at the University of Korea (Seoul), Moscow University (Russia) and the Virginia Polytechnic Institute (U.S.).

Q: Is there a way of confusing your device? Generating emotional uncertainty, for example?
A: EEG systems are prone to noise issues, whether on-head noise or ambient noise, creating disturbances requiring noise filtering to obtain the EEG signals. Brainwaves are very small electrical signals in the micro-volt range and certain types of noise may interfere with proper interpretation. The challenge for all EEG systems is to reduce the impact of noise during the filtering operation. Noise filtering continues to be an important technical priority in this industry. NeuroSky’s technology is not deemed “medical grade”, as we focus on consumer and not medical applications.

Q: Do you have any additional cool anecdotes about the device? For example, an amazing function it’s been used for, or a video-game trick someone accomplished?
A: Our current, single sensor technology reads and translates brainwaves on, in medical terms, the Fp1 or Fp2 locations on the forehead. There is a wealth of mental state information that can be retrieved from the forehead location, for example, a user’s attention, meditation, drowsiness, anxiety, pleasure or displeasure. Today, we have already tested and released two important mental state values: Attention & Meditation.

This continually updated mental state information is sent from the headset to an application, say a video game. The developer of that video game then decides how he or she wishes to map the attention or meditation states. In our “NeuroBoy World Demo,” which we let users experience at our trade shows and which is included as part of our SDK license, our own developers decided to map these states to certain telekinesis modes of operation. They still require the user to manually set the mode and select the object via a keyboard command, but once set, a certain mental state is mapped to operate on that mode (we view our technology as an additional complement, and not an end-all, to the current gaming experience).

For example, if the user manually sets the “lift” mode and selects an object in the room, this mode corresponds to the user’s “meditation” level at the time. By relaxing and emptying his or her mind, the meditation level grows higher and eventually the object can be raised up. Another powerful way of using these mental states is for the application to simply monitor the user. Say, for example, you are listening to someone speak. Are you paying attention? Someone could ultimately develop an audio application that measures a person’s attention level during a played speech and identifies the level of “listening performance,” then makes suggestions for improvement, with the mental state feedback providing the benchmark for performance measurement. The same can be applied to market research applications using EEG to rate a user’s response to an advertisement or television show during a focus group session, in order to make determinations about the effectiveness of a particular media piece. The end-use of this technology is not being driven by NeuroSky, but by the imaginations of the customers that we encounter every day. And, I must say, there have been some very creative ideas that will find their way onto retail shelves in a relatively short period.
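The “lift” mapping Hyver describes is essentially a transfer function from a 0-100 meditation value to an object’s height. A minimal sketch of that idea (the threshold, rate, and time-step constants here are invented for illustration, not NeuroSky’s actual mapping):

```python
def update_height(height, meditation, dt=0.1, threshold=50, rate=0.5):
    """Raise the object while meditation exceeds the threshold,
    let it sink back toward the floor otherwise.

    All constants are illustrative; meditation is a 0-100 reading.
    """
    drive = (meditation - threshold) / 100.0   # positive above threshold
    return max(0.0, height + drive * rate * dt * 100)

# five ticks of a rising meditation reading
h = 0.0
for med in [30, 55, 70, 90, 90]:
    h = update_height(h, med)
print(round(h, 2))  # → 5.25
```

The game loop would call something like this every frame with the latest meditation value, so the object floats up smoothly rather than jumping.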




AGES 8 and UP
Breakthrough toy can read your mind, move objects
BY Vito Pilieci  /  January 08, 2009

Giving new meaning to the phrase mind over matter, technology that gives people the ability to move objects by thinking will soon be available at North American toy stores. Mattel Inc. has created a game that can read a child’s mind and use thoughts to manoeuvre a small foam ball through a table-top obstacle course. The Mind Flex uses technology that reads the electrical impulses that occur within a brain while a person is thinking (an approach known as biofeedback). A device that looks like a pair of headphones sits on the child’s head and tracks brain activity. Within the obstacle course are small fans that are activated when a child thinks. The more brain activity the child produces, the faster the fans blow. The goal is to have the child “think” the little foam ball through the obstacle course. The toy, to be officially revealed this week at the Consumer Electronics Show in Las Vegas, is expected to be in stores later this year. The Mind Flex is targeted at children eight and up and will retail for $80 U.S.

While the technology may sound straight from Star Trek, researchers have long been working on ways to use brain activity to direct machines. “It all goes back to neurofeedback that has been around for 50 years, where you can record activity coming from the human brain through the scalp,” said Melvyn Goodale, Canada Research Chair in Visual Neuroscience at the University of Western Ontario. “It has the outside look of a science-fiction theme. You are controlling things through mind waves. But things like this have been around in various science museums for some time.” Mr. Goodale says a museum in Sarasota, Florida, displays a similar toy that pits two competitors against one another. Instead of floating a ball through an obstacle course, each player tries to score a goal in a competitor’s net. The player who creates and sustains the most brain activity powers a set of fans that pushes a foam ball into the rival’s goal. Scientists are also delving into mind-over-matter technology, hoping to isolate specific brain activity with the goal of allowing people to interact with a computer or TV without a mouse, remote or keyboard. The technology may also be used to help people who have lost their limbs control robotic prosthetics. “There are attempts to actually record activity of specific parts of the brain,” said Mr. Goodale. “To use electrodes to record the activity of groups of cells of patients with spinal cord damage to get them to control robot arms, wheelchairs or cursors on a computer screen.”

Science may be close to a breakthrough, according to Mr. Goodale. He said several research papers detail advanced ways of capturing brain activity, and tests are already under way, meaning the day when human and machine can communicate may not be far off. “We’ve been working for 20 or 30 years on this mind-borg or cyborg stuff. All of these things are examples of new interfaces between humans and machines,” said Steve Mann, a professor with the department of electrical and computer engineering at the University of Toronto. Mr. Mann himself has been called the world’s first “cyborg,” and is famous for having created “wearable computers” that allow him to interact with devices. He is working on technology called the EyeTap, which looks like a sleek monocle and can record what a person sees. It can also act as a display for computer-generated content. The EyeTap also responds to its environment, automatically lowering or raising lighting when a person walks into a room.

Steve Mann
email : mann [at] eecg.toronto [dot] edu

New games powered by brain waves  /  January 10th, 2009

An elderly Chinese woman wearing a headset concentrates intensely on a small foam ball and it begins to rise slowly into the air. It’s not magic, but rather the latest game from toy maker Mattel, which allows players to move a ball around an obstacle course using just their powers of concentration. Focusing on the ball causes a fan in the base of the game, called Mind Flex, to start up and lift the ball on a gentle stream of air. Break your concentration and the ball descends. Once a player has the ball in the air, they need to try to weave it through hoops, towers and other obstacles. “It’s a mind-eye coordination game,” said Mattel’s Tim Sheridan. “As you relax you’ll find that the ball drops.” Mind Flex relies on EEG technology to measure brain wave activity through a headset equipped with sensors for the forehead and earlobes. The game, which will be available in September for 79.99 dollars, is being displayed by Mattel at the annual Consumer Electronics Show (CES) in Las Vegas.

But Mattel is not the only toy maker tapping into the power of the mind. In a report this week USA Today newspaper said game maker Uncle Milton plans to release a similar game this year. Called “Force Trainer” it is named after “The Force” powers of Yoda and Luke Skywalker in the popular Star Wars films. The game calls for players to lift a ball inside a transparent tube using their powers of concentration. “It’s been a fantasy everyone has had, using The Force,” the daily quoted Howard Roffman, president of Lucas Licensing, as saying. “Force Trainer” also uses electroencephalography, or EEG, to measure electrical activity in the brain recorded on a headset containing sensors. A company called NeuroSky adapted the EEG technology for both games, according to USA Today.



New game gizmo uses mind control
BY Asher Moses  /  May 7, 2008

An Australian company is gearing up to release a computer headset that allows people to control video games using only the power of their minds. Emotiv Systems, founded by four Australian scientists in 2003, will release the $US299 ($315) EPOC headset on the US market this year. Featuring 16 sensors that measure electrical impulses from the brain, the headset – which plugs into the PC’s USB port – will enable games to register facial expressions, emotions and even cognitive thoughts, allowing players to perform in-game actions just by visualising them.

The headset works in a similar way to voice recognition, in that it must first be calibrated using Emotiv’s software to recognise patterns in the user’s electrical brain impulses, which are used to perform 30 preset actions. When the player thinks those same thoughts in the game, the software associates them with the correct action, such as rotating or pushing an object. “If you look at the way we communicate with machines up to this day, it’s always in a conscious form, so whether you turn on and off the light or you program software you always consciously tell a machine to perform a task for you,” Emotiv CEO and co-founder Nam Do said in an interview from the company’s Pyrmont offices. “But the communication among ourselves is much more interesting because we have non-conscious communications, so we read body language, we read facial expressions and we also have feelings and emotions which differentiate us from machines. Our vision for the next generation of man-machine interface is it’s not going to be limited to just conscious [interaction].”
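Emotiv has not published how its classifier works; as a toy illustration of the calibrate-then-recognise idea described above (training examples averaged into one template per action, live samples matched to the nearest template), assuming feature vectors have already been extracted from the sensor signals:

```python
import math

def train(examples):
    """Average each action's training feature vectors into a template."""
    templates = {}
    for action, vectors in examples.items():
        n = len(vectors)
        templates[action] = [sum(v[i] for v in vectors) / n
                             for i in range(len(vectors[0]))]
    return templates

def recognise(templates, sample):
    """Return the action whose template is nearest (Euclidean) to sample."""
    return min(templates, key=lambda a: math.dist(templates[a], sample))

# toy calibration: two "thoughts", three noisy examples each
examples = {
    "push":   [[1.0, 0.1], [0.9, 0.2], [1.1, 0.0]],
    "rotate": [[0.1, 1.0], [0.2, 0.9], [0.0, 1.1]],
}
t = train(examples)
print(recognise(t, [0.95, 0.15]))  # → push
```

A production system would work on far richer features and a more robust classifier, but the calibration/recognition split is the same shape as the voice-recognition analogy Do draws.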

While the headset will work in a very limited sense with existing titles, Do said the major game developers and publishers were designing a number of their upcoming titles to take full advantage of the technology. For instance, an in-game avatar would be able to mimic the human player’s facial expressions – smiles, winks, grimaces, and so on – in real time, and other non-human characters in the game could respond to these. “If you shoot somebody and you’re smiling, the non-player character can turn around and say to you, ‘What are you laughing at? You just killed that dude,’ ” Do said.

The headset could also detect the players’ emotions – whether they’re bored, angry, engaged, happy, stressed, etc – and adjust difficulty levels, in-game music and the game environment accordingly. Characters could also react to a player’s emotional cues. In horror-themed games, enemies could intelligently select the perfect time to startle a player based on how they feel, rather than having opponents in the same positions every time a mission is reloaded.

But the most powerful aspect of the EPOC is its ability to detect thoughts. Players can just think about performing actions, such as lifting or pushing objects or making them disappear, and have the game act accordingly without the need to push any keys or buttons. All of these features have been publicly demonstrated to thousands at gaming conferences using a role-playing game developed by Emotiv. It will be included for free with the headset and was trialled in Sydney.

Do, who came to Australia from Vietnam in 1995 on a university scholarship, said his intention was not to replace the keyboard or traditional game controller; he simply wanted to add another layer to the experience. “You can still move around using your joystick, using your keypad, using your mouse and keyboard, just like a normal game, but there is a lot of activity that we take to another level by adding a headset – such as being able to levitate an object by thinking about it,” he said.

He said that, while the company was initially focused on gaming, the technology had applications in any situation where humans interact with machines, such as in medicine and robotics. Further, market research companies and even Hollywood studios were tapping Emotiv’s technology to measure reactions from focus groups. Emotiv spent two years developing its technology in Sydney before moving its headquarters to San Francisco, near Silicon Valley, in 2005. It employs about 50 staff – neurologists, biomedical scientists, mathematicians, engineers – but its entire research team is still based in Sydney.

Moving to the US, Do said, meant Emotiv was “closer to all the action, all the [big gaming] companies, all the clients and also access to money, because, as a start-up company, money is always one of the key considerations”. He said Emotiv had been approached by numerous suitors keen to acquire the company, but wanted to first see how far the technology could grow. Emotiv has also had meetings with the major game console makers about licensing the technology to them for future products.

In addition to Do, Emotiv was founded by 1998 Young Australian of the Year Tan Le; Neil Weste, a neuroscientist who sold his chip manufacturing company Radiata Communications to Cisco in 2000 for $US295 million; and Allan Snyder, the director of the University of Sydney’s Centre for the Mind and winner of the 2001 Marconi Prize. The four founders self-funded the initial $1 million needed to start the company but have since raised $US14.5 million in series A funding. It is now in the process of raising series B funding.

Nerve-tapping neckband used in ‘telepathic’ chat
BY Tom Simonite / 12 March 2008

A neckband that translates thought into speech by picking up nerve signals has been used to demonstrate a “voiceless” phone call for the first time. With careful training a person can send nerve signals to their vocal cords without making a sound. These signals are picked up by the neckband and relayed wirelessly to a computer that converts them into words spoken by a computerised voice. Users needn’t worry about the system voicing their inner thoughts, though. Michael Callahan, co-founder of Ambient, the company behind the device, says producing signals for the Audeo to decipher requires “a level above thinking”: users must think specifically about voicing words for them to be picked up by the equipment.

The Audeo has previously been used to let people control wheelchairs using their thoughts. “I can still talk verbally at the same time,” Callahan told New Scientist. “We can differentiate between when you want to talk silently, and when you want to talk out loud.” That could be useful in certain situations, he says, for example when making a private call while out in public. The system demonstrated at the TI conference can recognise only a limited set of about 150 words and phrases, says Callahan, who likens this to the early days of speech recognition software. At the end of the year Ambient plans to release an improved version, without a vocabulary limit. Instead of recognising whole words or phrases, it should identify the individual phonemes that make up complete words. This version will be slower, because users will need to build up what they want to say one phoneme at a time, but it will let them say whatever they want. The phoneme-based system will be aimed at people who have lost the ability to speak due to neurological diseases like ALS – also known as motor neurone disease.

Thinking of words can guide your wheelchair
BY Tom Simonite / 06 September 2007

A motorised wheelchair that moves when the operator thinks of particular words has been demonstrated by a US company. The wheelchair works by intercepting signals sent from the operator’s brain to their voice box, even when no sound is actually produced. The company behind the chair, Ambient, is developing the technology with the Rehabilitation Institute of Chicago, in the US. The wheelchair could help people with spinal injuries, or neurological problems like cerebral palsy or motor neurone disease, operate computers and other equipment despite serious problems with muscle control. The system will work providing a person can still control their larynx, or “voice box”, which may be the case even if they lack the muscle coordination necessary to produce coherent speech.

The larynx control system, called Audeo, was developed by researchers Michael Callahan and Thomas Coleman at the University of Illinois at Urbana-Champaign, US, who together also founded Ambient. The system works via a sensor-laden neckband which eavesdrops on electrical impulses sent to larynx muscles. It then relays the signals, via an encrypted wireless link, to a nearby computer. The computer decodes these signals and matches them to a series of pre-recorded “words” determined during training exercises. These “words” can then be used to direct the motorised wheelchair. Callahan and Coleman say they can also be sent to a speech synthesiser, allowing a paralysed person to “speak” out loud. Recent refinements to the algorithms used may make it possible to interpret whole sentences thought out by the user. This could potentially restore near-normal speech to people who have not spoken for years, the researchers say. “Everyone working on brain-computer interfaces wants to be able to identify words,” says Niels Birbaumer from Eberhard Karls University in Tübingen, Germany, who is developing similar systems for use by stroke victims. “If this works reliably I would be very impressed; it is very hard to record signals from nerves through the skin.”
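Ambient hasn’t published its matching algorithm; a toy sketch of the matching-plus-confidence idea described above (a live signal trace compared against pre-recorded word templates by normalised correlation, with a confidence floor so ambiguous signals issue no wheelchair command; all numbers invented):

```python
def best_match(signal, templates, min_score=0.8):
    """Match a nerve-signal trace to pre-recorded word templates by
    normalised correlation; return None below the confidence floor."""
    def norm(v):
        mag = sum(x * x for x in v) ** 0.5
        return [x / mag for x in v] if mag else v
    s = norm(signal)
    scores = {word: sum(a * b for a, b in zip(s, norm(t)))
              for word, t in templates.items()}
    word = max(scores, key=scores.get)
    return word if scores[word] >= min_score else None

# invented templates recorded during a training exercise
templates = {
    "forward": [0.0, 1.0, 1.0, 0.0],
    "left":    [1.0, 0.0, 0.0, 1.0],
}
print(best_match([0.1, 0.9, 1.1, 0.0], templates))  # → forward
print(best_match([0.5, 0.5, 0.5, 0.5], templates))  # ambiguous → None
```

Returning None for low-confidence matches matters more here than in a game: an uncertain classification should stop the chair rather than move it.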

Birbaumer adds that measuring brain waves using an electrode cap or implants placed directly in the brain has been used to control computers and wheelchairs before, but so far there is little evidence that either method can reproduce either single words or continuous speech. “Recording from outside the brain like this may be the only way to do it,” he says. On the other hand, reading information directly from the brain is the only way to help people with very severe spinal injuries. “I have some patients not even able to send nerve signals to the muscles in their face,” he told New Scientist. “In those cases you have to try and interface with the brain.” Ramaswamy Palaniappan, who works on EEG-based brain-computer interfaces at Essex University, agrees this is a limitation. “The main advantages of their device are that it is very portable, not cumbersome to set up, and easy to use,” he told New Scientist. NASA produced a system similar to the Audeo in 2004. This can recognise a handful of short words and numbers, and the individual letters of the alphabet. The agency hopes to eventually integrate the technology into spacesuits.


“Although never released, the Mindlink drew feedback from Atari engineers and testers suggesting that the time and effort put into the system was wasted: the controllers did not perform well, and gave people headaches from over-concentration and from constantly moving their eyebrows around to control the onscreen activities.”

Next generation of video games will be mental
by Duncan Graham-Rowe  /  13 March 2008

Two players sit across a table from one another, staring at a small white ball on a track between them. Both are wearing headbands and concentrating, trying to nudge the ball towards their opponent. All they can use is the power of thought. This is Mindball, an addictive “mind game” in which the winning strategy is to remain as focused and relaxed as possible in the heat of battle. The ball rolls away from the player with the calmest mind, as measured by sensors on their headbands. The sensors are similar to those in an electroencephalogram (EEG), which probes brain activity by detecting “brainwaves” – tiny electrical currents playing across the scalp. Because EEGs are a non-invasive and near-instantaneous way to read brain activity, they have long been touted as potentially useful in gaming. It now looks as if that promise will be fulfilled, and not just in Mindball.

Several companies are developing hardware and software which they claim can detect brainwaves and use them in video games. If all goes to plan, the first of a new generation of games with mind control as a central feature will hit the high street this year. Mind gaming has its roots in the way new games are tested. Since 2004, EmSense, a company based in Monterey, California, has been using biofeedback to help game designers evaluate new products. Testers play a game wearing EmSense’s headset, which uses an EEG to record their brainwaves, and also measures their heart rate and the sweatiness of their skin. EmSense then builds up a blow-by-blow profile of the player’s emotional state and levels of arousal during play so the game can be made more engaging.

The basic technologies inside EmSense’s headset are nothing new. Neurologists have been using the EEG as a diagnostic tool for nearly a century, while measuring heart rate with an electrocardiogram (ECG) has an even longer pedigree. Galvanic skin response (GSR), which measures emotional arousal via the conductivity of the skin – a proxy for sweatiness – has been a central element of lie detectors since the first world war. Now, though, developers are finding ways to go beyond merely improving traditional games, and incorporating biofeedback into the games themselves. One of the leaders in the field is Emotiv of San Francisco. It has developed a headset with 16 sensors that it says allows players to control aspects of a game simply by thinking about them: concentrating on an on-screen object, for example, might allow their avatar to pick it up and move it around. Similarly, NeuroSky of San Jose, California, has developed a headset which chief executive Stanley Yang says can tell whether you are focused, challenged, relaxed, afraid, anxious and so on using a single sensor held against the temple.

Developers have been trying to incorporate biofeedback into gaming for years. In 1984, Atari experimented with a headband called MindLink, which used electromyographic (EMG) sensors to detect muscle movements, allowing players to move an on-screen cursor with a frown or a raised eyebrow. In 1998, AmTex developed a game called Bio Tetris for the Nintendo 64. A heart-rate sensor clipped to the player’s ear lobe allowed players to slow the speed at which the Tetris blocks fell by remaining calm. Neither took off. So what’s different this time? One factor is the dazzling success of Nintendo’s Wii – 20 million consoles sold and rising. It dispenses with the traditional joystick and instead uses gestures to control the game via a hand-held wireless motion sensor. Its success has made it clear that people are ready for new ways to interact with games.

Another factor is that the core technology is different, though whether it works is another matter. On the surface the claims seem plausible. Neurologists have long known how to read emotional states off an EEG, and Mindball apparently picks up alpha waves – a hallmark of mental calmness. But there are also many reasons to be sceptical. Where neurologists use as many as 120 EEG sensors all over the scalp, gaming headsets have just a handful – or, in NeuroSky’s case, just one. The headsets don’t use the sticky conductive gel that medical EEGs need to transmit the signal from the scalp to the electrode. On top of that, EEGs are notoriously “noisy” – prone to interference from nearby electrical devices as well as the electrical activity produced by muscles, especially the heart. Even blinking can play havoc with an EEG signal. So how do they do it? Although the firms are cagey about how exactly their technologies work, there are a few details to go on. The number of electrodes seems to come down to a question of resolution. Medical and research-grade EEGs need to be sensitive enough to detect subtle signals amid a chorus of electrical brain activity. For gaming, the chorus itself is sufficient. “We can’t achieve the same resolution as medical EEGs, but it’s enough to detect the basic brainwaves,” Yang says.
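Mindball’s calmness measure, as described above, rests on detecting alpha-band power (conventionally around 8-12 Hz) in the EEG. One standard way to estimate signal power at a single frequency from raw samples is the Goertzel algorithm; a sketch, with the sampling rate and test frequencies chosen purely for illustration:

```python
import math

def goertzel_power(samples, target_hz, rate_hz):
    """Estimate signal power at target_hz via the Goertzel recurrence."""
    k = 2 * math.cos(2 * math.pi * target_hz / rate_hz)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

rate = 256                          # Hz, a common EEG sampling rate
t = [i / rate for i in range(256)]  # one second of samples
calm = [math.sin(2 * math.pi * 10 * ti) for ti in t]  # 10 Hz "alpha"
busy = [math.sin(2 * math.pi * 25 * ti) for ti in t]  # 25 Hz "beta"
print(goertzel_power(calm, 10, rate) > goertzel_power(busy, 10, rate))  # True
```

Real headsets face the noise problems the article goes on to describe, so any such detector would sit behind heavy filtering; but band power at the alpha frequency is the core quantity a "calmness" score is built from.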

Dealing with interference is another matter and perhaps the fledgling industry’s biggest challenge. EMG signals, produced by muscle activity, are a particular problem because they can be an order of magnitude bigger than those produced by the brain, says Desney Tan, a researcher at Microsoft in Redmond, Washington, who has worked on diagnosing cognitive states using EEGs. Yang agrees that this is a challenge, but says the trick is to develop software that can recognise and filter out unwanted signals. “Our core technology is filtering,” he says. In any case, it may not be necessary to filter out all EMG signals, says Tan. Some could be turned to the developers’ advantage, as there is a strong correspondence between involuntary facial muscle contractions and your cognitive states and emotions, he says. So EMG signals can be used to supplement the EEG. In fact, the companies already use other bio-information from electrocardiogram (ECG) and galvanic skin response (GSR) sensors. Heart rate and sweating are both good measures of how physically and mentally aroused someone is. The trick is to combine all the measurements to get an overall sense of the player’s mental and physical state. “Each of these sensors gives us a little piece of the picture,” Tan says. Using slight variations of this approach, NeuroSky and its competitors claim to have cracked it.
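None of the vendors disclose their filtering pipelines, but the simplest version of "filter out unwanted signals" is a band-pass that discards slow electrode drift below the EEG range and the higher frequencies where EMG and mains hum dominate. A rough frequency-domain sketch (the cutoffs and test frequencies are illustrative, not anyone's actual product values):

```python
import numpy as np

def bandpass(signal, fs, lo=1.0, hi=30.0):
    """Zero out spectral components outside [lo, hi] Hz and invert the FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 256
t = np.arange(0, 2, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t)            # stand-in for a 10 Hz alpha rhythm
emg = 3.0 * np.sin(2 * np.pi * 60 * t)      # muscle artifact, larger than the EEG
drift = 2.0 * np.sin(2 * np.pi * 0.5 * t)   # slow electrode sway
cleaned = bandpass(eeg + emg + drift, fs)
# `cleaned` is essentially `eeg`: the 60 Hz artifact and the drift are gone.
```

Real products would need something subtler, since cardiac and facial-muscle artifacts overlap the EEG band itself, which is presumably where the proprietary part of "our core technology is filtering" lives.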

But even if the technology works as advertised, there’s no guarantee of success in a competitive games market. How do these companies intend to succeed where others failed? One answer is by following Nintendo’s example with Wii. MindLink and Bio Tetris failed in part because they didn’t make the most of their novel interface. The games were really no different from what was already out there. “With the Wii, Nintendo did something right in designing a suite of tailored games,” Tan says. Wii games offer features that are not possible with a regular joystick, such as swinging the controller like a tennis racquet or brandishing it like a sword. The mind-game companies intend to emulate this, though without abandoning the traditional controller altogether. Their games will still be largely controlled by hand, with biofeedback offering additional features. For example, Emotiv has adapted a game based on the Harry Potter books so that players can lift boulders and throw thunderbolts just by concentrating on making it happen.

Whatever form biofeedback games take, the world is ready for them, says Kiel Gilleade, a computer games researcher at Lancaster University in the UK. The current market is less interested in finding new game genres than in looking for new hardware to enhance the gaming experience, he says. Not everyone is convinced. Michael Zyda, director of the University of Southern California’s GamePipe Laboratory in Marina del Rey, says biofeedback seems to work as an evaluation tool, but he believes not enough research has been done to confirm its reliability in the real world. Hans Lee, head of technology at EmSense, agrees that more work is needed. One outstanding problem, he says, is that the hardware and software don’t work for everyone: reading emotional and cognitive states reliably is difficult because of individual variation in brain activity. Even so, at least one company believes the technology is ready. Emotiv says its headsets will be on the shelves later this year, alongside a suite of biofeedback games developed by its partners. Biofeedback has been talked about in video gaming for years, but the real quest for hearts and minds begins here.

Electrical activity isn’t the only way to read a gamer’s mind
by Colin Barras

EEG is just one form of brain-computer interface that the games industry is toying with. The other is optical topography, which overcomes some of the problems with EEG. Though EEG provides lightning-quick read-outs – perfect for fast-reaction games – speed comes at the cost of resolution. This makes some experts doubt its usefulness in gaming. An EEG does little more than tell you the general area that brain activity originated in, says John-Dylan Haynes of the Bernstein Center for Computational Neuroscience in Berlin, Germany. At the other end of the spectrum is functional magnetic resonance imaging (fMRI), which provides readings accurate down to a millimetre. For gaming, however, fMRI has crippling disadvantages: a price tag in the millions, bulky equipment and the fact that it takes at least 10 seconds to collect each data point.

Optical topography offers a compromise. It relies on the fact that oxygenated blood absorbs more infrared radiation than deoxygenated blood. When the brain is busy, the body responds by pumping oxygen to active regions. Optical topography uses near infrared spectroscopy (NIRS) to monitor changes in blood oxygenation, measuring brain activity by proxy. Technology companies are beginning to see the advantages. Hitachi, for example, showcased an optical topography headset in London last month. The headset can distinguish between an active and a resting brain, and is mobile enough to tempt game developers. “Our wireless headset can be worn without interfering with other activities,” says Atsushi Maki of the Hitachi Advanced Research Laboratory in Hatoyama, Japan. Maki thinks that the future might be a combination device. “Light doesn’t interfere with electric fields,” he says. “You could develop a headset that combines NIRS with EEG to give complementary readings.”
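Under the hood, NIRS recovers oxygenation changes via the modified Beer-Lambert law: the measured change in optical attenuation at each wavelength is a linear mix of the oxy- and deoxy-haemoglobin concentration changes, so measuring at two wavelengths yields a 2x2 linear system to solve. A sketch of that inversion, with made-up extinction coefficients (real instruments use tabulated values for their specific wavelengths, commonly around 760 nm and 850 nm):

```python
import numpy as np

# Illustrative extinction coefficients [HbO2, HHb] at two NIR wavelengths.
E = np.array([[1.5, 3.8],    # deoxy-sensitive wavelength
              [2.5, 1.8]])   # oxy-sensitive wavelength
PATH = 20.0                  # effective optical path length, arbitrary units

def oxygenation_change(delta_attenuation):
    """Solve the modified Beer-Lambert system dA = E @ dC * path
    for the concentration changes dC = [dHbO2, dHHb]."""
    return np.linalg.solve(E * PATH, np.asarray(delta_attenuation))

# Round trip: an active brain region gains oxy- and sheds deoxy-haemoglobin.
true_dc = np.array([0.02, -0.01])
dA = E @ true_dc * PATH
recovered = oxygenation_change(dA)
# recovered matches true_dc, i.e. a rise in HbO2 and a drop in HHb.
```

This is why the headset only needs to tell "active" from "resting": the oxygenation response is slow (seconds), but it is a clean, localized proxy for brain activity.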

Brain-computer interface for Second Life  /  12 Oct 2007

While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in conjunction with the Second Life online virtual world — until now. A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts — the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.

The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user’s imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle. Future plans are to improve the BCI so that users can make Second Life avatars perform more complex movements and gestures. The researchers hope the mind-controlled avatar, which was created through a joint medical engineering project involving Keio’s Department of Rehabilitation Medicine and the Tsukigase Rehabilitation Center, will one day help people with serious physical impairments communicate and do business in Second Life.
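The keyboard-emulator stage of the Keio pipeline is conceptually simple: whatever label the brain-wave classifier emits gets translated into a keystroke Second Life already understands. A toy sketch of that dispatch step (the label and key names are assumptions for illustration; the actual classifier and key mapping are not public):

```python
# Map classifier output (imagined movement) to an avatar-control keystroke.
COMMAND_KEYS = {
    "imagine_feet":      "up_arrow",     # walk forward
    "imagine_left_arm":  "left_arrow",   # turn left
    "imagine_right_arm": "right_arrow",  # turn right
}

def to_keystrokes(labels):
    """Convert a stream of classifier labels into emulated key presses,
    ignoring frames where no motor imagery was detected."""
    return [label for label in (COMMAND_KEYS.get(l) for l in labels) if label]

stream = ["imagine_feet", "rest", "imagine_left_arm", "imagine_feet"]
presses = to_keystrokes(stream)
# presses == ["up_arrow", "left_arrow", "up_arrow"]
```

The hard part, of course, is everything upstream of this dictionary: turning motor-cortex EEG into reliable labels in real time.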

“Brainloop utilizes a Brain Computer Interface (BCI) system which allows a subject to operate devices merely by imagining specific motor commands. These mentally visualized commands may be seen as the rehearsal of a motor act without the overt motor output; a neural synapse occurs but the actual movement is blocked at the corticospinal level. Motor imagery such as “move left hand”, “move right hand” or “move feet” become non-muscular communication and control signals that convey messages and commands to the external world. In Brainloop the performer is able – without physically moving – to investigate urban areas and rural landscapes as he globe-trots around virtual Google Earth.”

“This furry friend sits with you in your neurofeedback session, to keep you company, and help you out! He’s not just any bear. This bear rumbles and grumbles with your EEG. So when you do the right thing, you feel the bear rumble a little louder, and when you are not focused, and not on target, the bear gets quiet. So he’s there to help you out, and he’s really great for those who can’t see well, or who don’t like the visual displays. But he’s a friendly bear, he gets along great with everyone he meets.”

The Measurement, Interpretation, and Use of EEG Frequency Bands
by Thomas F. Collura  /  December 7, 1997

How brain rhythms are generated:
Populations of cells generate rhythms when they depolarize in synchrony. This activity occurs primarily in the upper four layers (about 1/4 inch thick) of the cerebral cortex. The presence of an EEG rhythm indicates that some brain activity is occurring, in the sense of millions of cells acting together in a synchronized fashion. The exact causes of this, and what it means for the brain and information processing, are an entire dissertation in themselves. Overall, the observed brainwave frequencies must be thought of as “epiphenomena”: byproducts of normal brain function, but not brain signals in themselves. The brain does not communicate, or do its business, using the EEG. Rather, the EEG is a secondary measure, like the vibration measured from an engine, or the temperature of an electronic circuit. The brain does not, for example, produce alpha waves for any purpose. It produces them as a result of certain types of brain activity, and we can learn to recognize them, and take advantage of them, by learning what they represent and what happens when we work with them.

Training of EEG rhythms
Biofeedback techniques can be used to train EEG rhythms. Training systems can use visual feedback, auditory feedback (sounds), or verbal feedback from a personal trainer, thus making the trainee aware of which brain rhythms are present. Displays can be of many types, and computers are capable of producing a wide variety of useful ones, including “thermometers”, video games, and other graphic displays. Systems can be set up to reinforce, or to reduce, any rhythm or combination of rhythms, or to handle more complex situations, such as training different locations to be synchronized or desynchronized, or training different locations to produce (or inhibit) different frequencies. We can also train more complex, derived properties, such as brainwave synchrony, coherence, or relationships between brain rhythms recorded from different sites. This has been found particularly useful in training concentration and relaxation, for peak-performance training, and for athletes such as golfers. Certain EEG properties have been found conducive to being “in the zone,” a highly efficient and responsive state, useful for improving performance in many applications.
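A reinforce/inhibit protocol of the kind described reduces, at each update, to comparing band powers against thresholds and scaling the feedback accordingly. A minimal sketch of that control loop (the threshold values and the multiplicative combination rule are illustrative choices, not a clinical protocol):

```python
def feedback_level(reward_power, inhibit_power,
                   reward_thresh=0.4, inhibit_thresh=0.2):
    """Return a 0-1 feedback intensity: full feedback only when the
    rewarded rhythm is up AND the inhibited rhythm is down."""
    reward_ok = min(reward_power / reward_thresh, 1.0)
    if inhibit_power < inhibit_thresh:
        inhibit_ok = 1.0
    else:
        inhibit_ok = inhibit_thresh / inhibit_power
    return reward_ok * inhibit_ok

# The result could drive a "thermometer" display, a game element,
# or the volume of an audio tone:
on_target = feedback_level(0.5, 0.1)    # rewarded rhythm up, inhibited band quiet
off_target = feedback_level(0.2, 0.4)   # weak reward band, noisy inhibit band
```

The same scalar could just as well modulate the rumble of the neurofeedback teddy bear described above: louder when on target, quiet when not.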

It is important to realize that, although rhythms can be trained to produce desired results, the production (or reduction) of the specific rhythm is not an end in itself, and a change in the EEG may not signify that the desired change has occurred. Rather, the desired brain/mind changes are a byproduct of the training, independent of changes in the EEG itself. The brain is a self-regulating system, and may behave much like a thermostat that tries to keep the system stable. To use an analogy: if a window is left open in a house in the winter, the house may not be cold, but the furnace will be working hard and the heating bills will be high. If the window is closed, representing a return to normal operation, the temperature may not rise significantly, but the furnace will work less. Thus, the brain may achieve a desired state even if the measured variable, the brain rhythms, does not change significantly in and of itself. Nonetheless, changes in the brain have occurred, and their benefits may be forthcoming, even in the absence of large changes in the EEG signal.

Summary of EEG Frequency Bands:
The basic EEG rhythms are summarized briefly as follows, with regard to their typical distribution on the scalp, subjective states, tasks, physiological correlates, and the effects of training. This summary should be taken as a general roadmap, not as a set of hard-and-fast rules.

Delta (0.1-3 Hz):
Distribution: generally broad or diffuse, may be bilateral, widespread
Subjective feeling states: deep, dreamless sleep, non-REM sleep, trance, unconscious
Associated tasks & behaviors: lethargic, not moving, not attentive
Physiological correlates: not moving, low level of arousal
Effects of Training: can induce drowsiness, trance, deeply relaxed states

Theta (4-7 Hz):
Distribution: usually regional, may involve many lobes, can be lateralized or diffuse
Subjective feeling states: intuitive, creative, recall, fantasy, imagery, dreamlike, switching thoughts, drowsy; “oneness”, “knowing”
Associated tasks & behaviors: creative, intuitive; but may also be distracted, unfocused
Physiological correlates: healing, integration of mind/body
Effects of Training: if enhanced, can induce a drifting, trance-like state; if suppressed, can improve concentration and the ability to focus attention

Alpha (8-12 Hz):
Distribution: regional, usually involves an entire lobe; strong occipital w/eyes closed
Subjective feeling states: relaxed, not agitated, but not drowsy; tranquil, conscious
Associated tasks & behaviors: meditation, no action
Physiological correlates: relaxed, healing
Effects of Training: can produce relaxation

Low alpha sub-band (8-10 Hz): inner awareness of self, mind/body integration, balance
High alpha sub-band (10-12 Hz): centering, healing, mind/body connection

Beta (above 12 Hz):
The beta band has a relatively large range, and has been defined as anything above the alpha band.

Low Beta (12-15 Hz):
Distribution: localized by side and by lobe (frontal, occipital, etc.)
Subjective feeling states: relaxed yet focused, integrated
Associated tasks & behaviors: typically resting yet alert
Physiological correlates: Low Beta may be observed anywhere on the cortex. When it is recorded from the motor areas (C3, C4, Cz), it is considered to be the “Sensorimotor Rhythm” or “SMR”. SMR is reduced by muscular activity, e.g. moving an arm or leg; restraining the body may increase SMR.
Effects of Training: increasing SMR can produce relaxed focus and improved attentive abilities, and may help remediate attention disorders.

Midrange Beta (15-18 Hz)
Distribution: localized, over various areas. May be focused on one electrode.
Subjective feeling states: thinking, aware of self & surroundings
Associated tasks & behaviors: mental activity
Physiological correlates: alert, active, but not agitated
Effects of Training: can increase mental ability, focus, alertness, IQ

High Beta (above 18 Hz):
Distribution: localized, may be very focused.
Subjective feeling states: alertness, agitation
Associated tasks & behaviors: mental activity, e.g. math, planning, etc.
Physiological correlates: general activation of mind & body functions.
Effects of Training: can induce alertness, but may also produce agitation, etc.

Gamma (around 40 Hz):
Distribution: very localized
Subjective feeling states: thinking; integrated thought
Associated tasks & behaviors: high-level information processing, “binding”
Physiological correlates: associated with information-rich task processing
Effects of Training: not known
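Taken together, the bands above can be expressed as a simple lookup. The edges follow the ranges used in this summary; note that they are conventions rather than hard boundaries, so the gap-free half-open intervals below (and the assumed 35 Hz high-beta/gamma boundary) are choices made for illustration:

```python
# (name, low Hz inclusive, high Hz exclusive), following the summary above.
BANDS = [
    ("delta",         0.1,   4.0),
    ("theta",         4.0,   8.0),
    ("alpha",         8.0,  12.0),
    ("low beta",     12.0,  15.0),
    ("midrange beta", 15.0, 18.0),
    ("high beta",    18.0,  35.0),   # "above 18 Hz"; upper edge assumed
    ("gamma",        35.0, 100.0),   # centered on the ~40 Hz rhythm
]

def band_of(freq_hz):
    """Return the conventional band name for a frequency, or None."""
    for name, lo, hi in BANDS:
        if lo <= freq_hz < hi:
            return name
    return None
```

So a 10 Hz rhythm is classed as alpha and a 40 Hz rhythm as gamma; in practice a neurofeedback system applies exactly this kind of table to decide which band's power to reward or inhibit.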

A recent Journal of Neural Engineering article shows why progress seems to have slowed and gives an impression of the hurdles to be overcome. The basic premise of these systems is the same: first, electrical brain activity is recorded using a skullcap of electrodes; these signals are then processed and fed into a genetic algorithm, which evolves to associate particular signals with instructions to move. This is then tested, with the result that only about 41 percent of attempts to move are followed, and once every 12 seconds or so, the computer decides you wanted to move when you really didn’t. To put this in perspective, imagine that you were using this interface to control an artificial knee. Six in every ten steps would result in you tripping, and once every 12 seconds you would randomly kick someone (actually, I can see a benefit there). Clearly, a lot of effort is going into reducing the rate of false positives and increasing the rate of true positives.

The problem is that the skullcap records a very generalized and quite noisy signal, from which useful information must be extracted. The current approach seems to be using increasingly sophisticated filtering techniques to extract certain known signals, such as movement-related potentials, beta rhythms, and Mu rhythms. Changes to these signals are then compared and correlated to actual movement events. Using these more sophisticated methods, the true positive rate remains at about 50 percent. However, the false positive result has been substantially reduced to about 0.1 percent. Overall, this still isn’t that good because the system doesn’t pick up true positives very well. Although I was quite enthusiastic about the progress when this was first reported, it is becoming apparent that there is a lot of hard engineering work to do. The pessimist in me suggests that this approach will simply never be sensitive enough to pick up all the intended movements while still retaining a low false positive rate. This is because the skullcaps record a general, unlocalized signal from the brain and, from that, try to infer localized information.
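The figures quoted – roughly 50 percent true positives after filtering, and false positives down to about 0.1 percent – come from comparing the decoder's predicted movement events against actual ones, frame by frame. A sketch of that scoring on made-up label sequences:

```python
def rates(actual, predicted):
    """True-positive and false-positive rates for binary event streams.
    `actual` and `predicted` are equal-length sequences of 0/1 flags,
    one per time frame."""
    tp = sum(1 for a, p in zip(actual, predicted) if a and p)
    fp = sum(1 for a, p in zip(actual, predicted) if not a and p)
    pos = sum(actual)
    neg = len(actual) - pos
    return tp / pos, fp / neg

actual    = [1, 1, 0, 0, 1, 0, 0, 1]   # frames with intended movement
predicted = [1, 0, 0, 1, 1, 0, 0, 0]   # frames the decoder flagged
tpr, fpr = rates(actual, predicted)
# tpr == 0.5 (2 of 4 intents caught), fpr == 0.25 (1 of 4 quiet frames misfired)
```

The asymmetry in the reported numbers reflects a deliberate trade-off: for a prosthetic, a spurious movement is usually worse than a missed one, so the filters are tuned to drive the false-positive rate down even at the cost of sensitivity.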

This doesn’t mean the approach will never yield good results though. For instance, if proven safe, more invasive methods could be used to extract a more localized signal. Or perhaps neurological studies will allow researchers to use filtering techniques that result in localized information. However, pure filtering and statistical association may never yield enough specificity by itself. {Journal of Neural Engineering, 2007, DOI: 10.1088/1741-2560/5/1/002}


In the initial behavioral experiments, the researchers recorded and analyzed the output signals from the monkeys’ brains as the animals were taught to use a joystick to both position a cursor over a target on a video screen and to grasp the joystick with a specified force. After the animals’ initial training, however, the researchers made the cursor more than a simple display — now incorporating into its movement the dynamics, such as inertia and momentum, of a robot arm functioning in another room. While the animals’ performance initially declined when the robot arm was included in the feedback loop, the scientists found, they quickly learned to allow for these dynamics and became proficient in manipulating the robot-reflecting cursor. The scientists next removed the joystick, after which the monkeys continued to move their arms in mid-air to manipulate and ‘grab’ the cursor, thus controlling the robot arm. ‘The most amazing result, though, was that after only a few days of playing with the robot in this way, the monkey suddenly realized that she didn’t need to move her arm at all,’ said Nicolelis. ‘Her arm muscles went completely quiet, she kept the arm at her side and she controlled the robot arm using only her brain and visual feedback. Our analyses of the brain signals showed that the animal learned to assimilate the robot arm into her brain as if it was her own arm.’

Miguel Nicolelis
email :  nicoleli [at] neuro.duke [dot] edu…



Velliste believes that the secret to the arm’s success lay in making it as natural as possible. For a start, the monkey could freely move its head and eyes without affecting the signals controlling the arm. It could also move the fake arm in real time: there was only about a seventh of a second of delay between a burst of brain activity and the corresponding movement, and natural arms have similar delays between thought and deed. This nigh-instantaneous control was obvious during one trial when the animal dropped the food and immediately stopped moving the arm. This natural responsiveness made it easier for the monkeys to accept the arm as their own. They learned behaviours that had nothing to do with the task, like licking remaining food off the fingers or using the hand to push food into their mouths. They learned to move the arm in arcs to avoid knocking the food off the platform while bringing it back in a straight line. They even learned that the food (marshmallows and grape halves) sticks to the fingers: while they initially opened the hand only when it was near their mouths, one of them figured out that it could open the hand well before then.

Meel Velliste
email : mev3 [at] pitt [dot] edu

Andrew Schwartz
email : abs21 [at] pitt [dot] edu


Eberhard Fetz
email : fetz [at] u.washington [dot] edu

John Donoghue
email : John_Donoghue [at] Brown [dot] edu

the LUKE ARM……


I. Human/Machine Anomalies
The most substantial portion of the PEAR experimental program examines anomalies arising in human/machine interactions. In these studies human operators attempt to bias the output of a variety of mechanical, electronic, optical, acoustical, and fluid devices to conform to pre-stated intentions, without recourse to any known physical influences. In unattended calibrations all of these sophisticated machines produce strictly random data, yet the experimental results display increases in information content that can only be attributed to the consciousness of their human operators.

Over the [Princeton] laboratory’s 27-year history, thousands of such experiments, involving many millions of trials, have been performed by several hundred operators. The observed effects are usually quite small, of the order of a few parts in ten thousand on average, but they compound to highly significant statistical deviations from chance expectations. These results are summarized in “Correlations of Random Binary Sequences with Pre-Stated Operator Intention” and “The PEAR Proposition.”

A number of secondary correlations reveal other anomalous structural features within these human/machine databases. In many instances, the effects appear to be operator-specific in their details, and the results of given operators on widely different machines frequently tend to be similar in character and scale. Pairs of operators with shared intentions are found to induce further anomalies in the experimental outputs, especially when the two individuals share an emotional bond. The data also display significant disparities between female and male operator performances, and consistent series position effects are observed in individual and collective results. These anomalies can be demonstrated with the operators located up to thousands of miles from the laboratory, exerting their efforts many hours before or after the actual operation of the devices.

The random devices also respond to group activities of larger numbers of people, even when they are unaware of the presence of the machine. Such “FieldREG” data produced in environments fostering relatively intense or profound subjective resonance show larger deviations than those generated in more pragmatic assemblies. (See “FieldREG II: Consciousness Field Effects: Replications and Explorations.”) Venues that appear to be particularly conducive to such field anomalies include small intimate groups, group rituals, sacred sites, musical and theatrical performances, and other charismatic events. In contrast, data generated during most academic conferences, business meetings, or other mundane venues show smaller deviations than would be expected by chance.

Princeton University scientists believe that the human mind can influence machines. Now, when is the last time you said something nice to your computer?
by Rogier van Bakel  /  Apr 1995

“Come on, sweetheart, you can do it. Oh, now, show me what you’re made of. Thaaat’s it!” I am alone in a room with a woman I met barely an hour ago. She is talking softly, seductively, in a voice that is both sweet and persuasive. Not to me, mind you. She is directing her words – saccharine mutterings that other people might reserve for a sick child or a particularly weak puppy – to an ugly electronic box with a red digital display.

She is Brenda Dunne, the manager of the Princeton Engineering Anomalies Research laboratory, and she is giving me a demonstration of how she might “will” a random event generator (REG) to come up with more high than low numbers. She is somehow using the power of her mind to achieve that result. And the power of her voice. She coos. She crows. She coaxes. In case you were wondering: Dunne, a developmental psychologist, is far from the mad scientist type. But she is doggedly determined to prove what most physicists have never thought possible: that the human mind can change the performance characteristics of machines. Mind over matter, as it were. Sound crazy? The work at the PEAR lab has consistently shown that “normal” volunteers – not people who purport to have any psychic powers – can indeed influence the behavior of micro-electronic equipment with their minds, with their consciousness. This is done without the benefit of electrodes and wires – and without anyone being permitted to give the machine a good whack. Nearly a hundred volunteers have conducted 212 million REG trials during the 15 years of the lab’s existence, and the research shows a tiny but statistically significant result that is not attributable to chance. The volunteers didn’t even have to sweet-talk the machine into its deviations the way Dunne has just done. Some of the “operators” merely stare broodingly at the display, focusing their minds to beat the silicon into submission. Others let their thoughts wander or read a book. Two-thirds of the volunteers have been able to affect the REG in the direction they had intended (to select more high or more low numbers), while only half of them would have produced those results by chance. A few of them have gotten results that, when expressed in a graph, are so distinct the PEAR scientists can recognize these volunteers’ patterns at a glance. Dunne refers to such patterns as “signatures.”

The effects that the volunteers accomplish are very small, but amazing. “The operators are roughly altering one bit in 1,000,” explains Michael Ibison, a British mathematical physicist who has come to work for a year at PEAR after stints at Siemens, IBM, and Agfa. “That means if you had a coin toss, psychokinesis could affect one of those coin tosses if you tossed a thousand times.” The metaphor is apt. The REG, in its simplest form, is nothing more than an electronic coin flipper. It is designed to come up with as many heads as tails. That is precisely what the carefully calibrated instrument does when humans leave it alone. But sit an operator in front of it, and more often than not, the REG obligingly produces slightly more heads than tails – or vice versa, depending on the operator’s intentions. If that sounds weird, consider this: you don’t have to be in the same room as the REG to get results. Or, for that matter, in the same city, state, or country. Volunteers as far away as Hungary, Kenya, Brazil, and India have shown they can influence Princeton’s REG as if they were sitting 3 feet away.
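The "one bit in 1,000" figure makes the statistics easy to check: a binary REG is a Bernoulli process, so the z-score of an observed hit rate grows with the square root of the trial count. That is why an effect invisible in a thousand tosses becomes enormously significant across the 212 million trials cited. A quick sketch:

```python
import math

def z_score(hit_rate, n_trials, p0=0.5):
    """Standard score of an observed hit rate against a fair-coin null
    hypothesis, using the normal approximation to the binomial."""
    return (hit_rate - p0) / math.sqrt(p0 * (1 - p0) / n_trials)

effect = 0.5005   # "altering one bit in 1,000" above a 50% baseline
small = z_score(effect, 1_000)          # ~0.03: indistinguishable from chance
large = z_score(effect, 212_000_000)    # ~14.6: astronomically significant
```

Whether such a deviation reflects consciousness or some unaccounted-for systematic bias is exactly what the skeptics and PEAR disagree about; the arithmetic itself is not in dispute.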

What’s love got to do with it?
Another surprising finding occurred when Dunne and her team asked couples to interact with the REG. The effects generated by two people with an emotional attachment were much larger than those produced by an “unattached” pair of operators. And the PEAR team uses other instruments to get similar, perhaps even more striking, results. There’s the Random Mechanical Cascade known as Murphy (after Murphy’s Law), a 9-foot-high vertical contraption that drops 9,000 small polystyrene balls from a spout onto a grid of 336 evenly spaced nylon pegs. The balls land in a horizontal row of 19 bins at the bottom, in a distribution pattern that looks like a bell-shaped curve. Volunteers can “think left” or “think right,” and a majority of them can cause a modest but measurable shift in where the balls land. A more recent experiment involves operators trying to control the swing of a custom-built pendulum. In another one, they’re seated in front of a computer displaying two superimposed pictures, and are told to try to suppress one and bring the other to the foreground.

Skeptics have examined the lab’s instruments, its data-processing software, its protocols. Environmental, non-consciousness-related influences such as temperature differences, passing traffic, earth tremors, and vibrations from a nearby machine shop have been ruled out as a cause for the anomalies. Other scientists have, by and large, been able to replicate PEAR’s experiments – just as PEAR’s own work builds on other academically sound research. But the Princeton University lab has amassed so much scientific evidence of the consciousness effect that, by sheer quantity of data, it has become the foremost player in this field.

If the empirical proof seems to be strong and solid, the theoretical part – how does it happen, and what does it mean? – is uncharted territory. That doesn’t stop Brenda Dunne from developing theories of her own. She points to a clipping on the wall of her chaotic office. It’s a cartoon of two scientists. One says: “I actually prayed to receive this grant money. You won’t tell anyone, will you, Charles?” After we’re done laughing, Dunne strikes an earnest note. “It’s human nature to pray, to hope, to desire. Where does this fit into a scientific world view? How can you talk about a reality that has no place for human consciousness – the very human consciousness that created that world view in the first place, the consciousness that designs the models and observes the data? Where is it in the models?”

Dunne has come to believe that human consciousness establishes a “resonance” with the physical world that can reduce some of the randomness around us. “One form of this resonance is what we know as love,” she says, referring to the experiments with the bonded-pair operators. “Do we dare theorize that love has a palpable influence on random noise? I don’t know. I would be willing to at least raise the question. This emotional bond, the ‘being on the same wavelength,’ somehow reduces the entropy in the world a little bit. And random processes seem to reflect this reduction by showing a more organized physical reality.”

Heresy versus recalcitrance
It’s easy to scoff at such notions – call them scientific heresy, or New-Age drivel. And, to be sure, PEAR has a number of detractors. The editor of a prominent scientific journal once told the lab’s founder and senior scientist, Robert Jahn, that he might consider publishing Jahn’s recent paper, provided the author would transmit it telepathically. Dunne has learned to deal with the barbs, she says, “without flying off the handle, without getting angry or defensive. We welcome the criticism, and have frequently made changes at the suggestion of other scientists. I think it was Nietzsche who said: ‘Love your enemies, because they bring out the best in you.’ Unfortunately, many of our critics basically say: ‘This is the kind of nonsense I wouldn’t believe even if it were real.’ They’re people who have made up their minds that this is all hogwash, without having studied the data.”

Robert Jahn agrees. “We were not fully prepared for the degree of recalcitrance that we would encounter in otherwise learned, professional circles,” he says, somewhat testily, when I visit the lab again two months later and mention the criticism. Jahn is a thin, hollow-cheeked man in his 60s who is inseparable from his baseball cap. He holds an undergraduate degree in mechanical engineering, an MS and a PhD in physics, is a professor of aerospace sciences at Princeton University, and holds the position of Dean Emeritus of its School of Engineering and Applied Science. He started the PEAR Lab in 1979, putting his reputation, if not his career, on the line. Afraid of being lumped together with all kinds of swindlers and charlatans who have polluted the field, Jahn avoids words like “paranormal,” “psychic,” and “parapsychology” like the plague. The phenomena he looks at are “engineering anomalies,” he insists.

It was clear from the start that the subject he wanted to study was mired in controversy. Princeton administrators did not hide their concern over Jahn’s unusual career turn. They nodded their approval only after an ad hoc committee had been established to ensure Jahn’s research met scientific standards – a first at New Jersey’s famous powerhouse of learning. Although that committee has since been disbanded, a Princeton colleague, Nobel Prize-winning physicist Philip Anderson, has since attacked PEAR’s work, arguing that “if the effect of human consciousness on machines existed, hundreds of people would be beating the bank at Las Vegas.” Anderson also believes that it is not up to serious scholars to disprove Jahn’s data, smirking that this task should instead be handled by “those who are used to dealing in flim-flam, such as magicians and policemen.”

Jahn dismisses the Vegas comment as “a spurious red herring.” The effects PEAR has measured are much too small to have any usefulness at the roulette table, he counters wearily. And, yes, for the same reason, you can forget about trying to get the ATM to slip you an extra twenty, or about “willing” the traffic lights on Main Street to jump to green as you’re speeding along. The odds against you would be astronomical.

Virtual zoo
Robert Jahn’s office is a virtual zoo of stuffed animals, mostly given to him by friends, colleagues, and students. In a corner, Jahn proudly displays his collection of small carousel horses, the fruit of a one-time subscription to the Franklin Mint’s Horse of the Month Club. The book he co-authored with Dunne, Margins of Reality (Harcourt Brace & Co., 1987), a tome on the role of consciousness in the physical world, is dedicated to the scientists’ respective pets, “and to all our other animal friends, who kept watch and understood it all.”

He is quick to explain his fascination for animals. “I believe that the capacity for the so-called anomalous interaction between consciousness and the physical environment is best utilized by other-than-human life forms. You see it in the migration capabilities of birds and fish, and in the group consciousness that is evident in swarming insects. It is a capacity of consciousness that we have largely bred out of ourselves, as humans, by our preoccupation with the development of analytical and intellectual capabilities of the mind, leaving the intuitive aspects to wither.” PEAR has never done experiments with animal consciousness. However, scattered data from researchers who have done so seem to support Jahn’s notions (see “Animal Magnetism: Chick It Out,” page 84).

A flight down from his office, in the engineering school’s basement, the PEAR lab is also populated by an array of teddy bears and other furry friends, most of whom cozily hang out on the velvet orange couch that looks like yesterday’s Salvation Army special. On a wall hangs an official-looking certificate from a group called The Giraffe Project, proclaiming Jahn a giraffe because he “sticks his neck out.”

But perhaps more telling are the two signs over the copying machine. One simply identifies the machine as “Baby.” That’s the name the copier was given by the lab people, who are seeking its full cooperation by using a flattering moniker. “This machine is subject to breakdowns during periods of critical need,” warns the other sign. “A special circuit called a critical detector senses the operator’s emotional state in terms of how desperate he or she is to use the machine. The detector then creates a malfunction proportional to the desperation of the operator. Threatening the machine with violence only aggravates the situation. Likewise, attempts to use another machine also may cause it to malfunction. They belong to the same union. Keep cool, say nice things to the machine. Nothing else seems to work. Never let anything mechanical know you’re in a hurry.”

It’s pretty standard office humor. Identical signs must be hanging over thousands of computers and copiers throughout the country. But nowhere is it more appropriate than here, where a team of scientists ponders if “technical” glitches might not, sometimes, be the result of operator anxiety. “It’s funny,” critiques Dunne, nodding at the placard. “And on the surface, that’s all it is. But taking that kind of an attitude toward a machine, humorous though it may be, means treating the machine as if it were alive. To a degree, we all anthropomorphize the sophisticated equipment we work with – our computers, our cars.

“The way you treat a machine is going to have a great deal to do with the way it behaves. If you slam it, if you bang it, if you treat it like a thing, that reflects an attitude. If you consider the world an extension of yourself, it becomes a better place. Is that engineering? I don’t know. Probably, yes. At the very least, the equipment is simply going to last longer because you take better care of it.”

Murphy’s Law: joke or gospel?
Some people, it seems, only have to get near a computer system and it breaks down. And your hard drive, modern lore would have it, is most likely to give up the ghost when you absolutely need that report you forgot to back up last night. Is Dunne implying that Murphy’s Law is more than a whimsical piece of pseudo-science, more than a propeller head’s version of a generic urban legend? “Most people who work around technology laugh – nervously – when they hear about such phenomena,” she says, smiling. “They may dismiss it, but they know exactly what you’re talking about.”

The sentiment is echoed by Dean Radin, a researcher in Nevada who worked for Bell Labs in Columbus, Ohio, and later at Princeton. “In technical circles, Murphy’s Law is revered as the gospel; and on the other hand, the same people laugh about it. I also noticed that employees got reputations as jinxes or as people who would make systems work. And at Bell, whenever we were under the gun for an important demonstration to a VIP, the jinxes were not allowed to be present, and the people who were good were almost forced to be present. Because we figured it couldn’t hurt. And I thought this was interesting behavior for people who were otherwise highly analytical and quite bright.”

Radin, who has a master’s degree in electrical engineering and a doctorate in psychology, got his bosses to OK a study into a possible link between operator anxiety and a machine’s performance. “The reason I was able to sell this to management was that if one of the big telephone switching machines goes down for a second, we lose US$1 million in revenue,” he explains. Tenacious attempts notwithstanding, the link he sought to establish was elusive at best. “I did find, however, that there is a relationship between an operator’s intention and the performance of a machine,” Radin enthuses. He was able to replicate the random generator experiments that physicist Helmut Schmidt did at Boeing Scientific Research Labs in the ’60s. In late 1993, Radin started the privately funded Consciousness Research Laboratory through the University of Nevada. He conducts studies there that he calls both “similar” and “complementary” to PEAR’s. Is there not even a shadow of a doubt in his mind that the mind-over-matter effect really exists? “No. The criticism of this type of research by other scientists is just the usual knee-jerk reaction to unexpected data.”

A psychic garage door opener
That the effects of the human/machine resonance are tiny doesn’t mean they can’t be important in an exceedingly pragmatic way. For starters, they’re arguably greater than the effects of the flaw in Intel’s Pentium chip that caused such a brouhaha late last year. And small causes can have large consequences. In the late ’70s, Bob Jahn observed how certain aerospace technology was becoming so sensitive that it had to be protected against a passing cosmic ray. So, he told himself, it wouldn’t necessarily be a stretch to imagine some small effect that a human operator – someone who sits in front of a delicate machine, stares at it, and interacts with it for hours on end – might have on that same piece of equipment.

Jahn is not much closer to the answers now than he was 15 years ago. “We’re far short of understanding the parameters. It’s not at all clear how you shield a sensitive device to ignore the cross talk between the information processor that is our mind and the information processor that is the machine. But we know that the cross talk exists, under certain circumstances at least. And I still have the concern that it is the source of some of the gremlin effects that pilots report, and of events that occur in emergency situations where the stress among the operational crew is bound to be very high.”

PEAR may be laying the groundwork for technology that could eventually lead to more reliable vital computer systems, such as those used in air-traffic control or spacecraft. From the most pragmatic of perspectives, though, how will Dunne’s and Jahn’s notions about the properties of consciousness change the lives of average Janes and Joes everywhere? “Given what we see in the way of volunteers’ signature patterns, a personalized switch is a definite possibility,” Dunne muses. “It would open the garage door when you mind-beam your request at it. If you want to get really science fictiony, you might envision a car that’s been attuned to you, and when you’re very tense and nervous, a sensor in the car is able to pick this up, and won’t allow you to drive over 40 miles an hour.”

Does she believe we’ll reach a point where we can interact with our machines by the sheer force of our minds? “It’s long-term, but I won’t be surprised,” she allows. “We are moving in that direction, with technology being developed for disabled people, in which they can wire the device to brain waves or eye movements [see “Zen and the Art of Flying a Plane,” page 86]. “We already have voice-activated computers, and voice locks that recognize the voice patterns of authorized users. That’s damn close. If you have something that you might call a psychic signature, how different is that from a voice pattern? Not very.”

Dunne also sees a medical component to PEAR’s ventures into human/machine relationships. “Our own bodies are perhaps the most complex and sensitive of information processing machines. That raises the question: Is it possible that some of the processes that go on in the body – for example, random fluctuations in functions like heartbeat, immune response, neuronal connections, and the like – might be susceptible, or indeed might be designed to respond to the directives of our consciousness? There are many stories about patients who refuse to accept a diagnosis and get better in spite of every expectation. On the other hand, there are tales of patients who refuse to get better even though there is nothing terribly wrong. If we could get a handle on this thing, it might make a vast difference in how we heal ourselves. I know this is very speculative, but it’s an area that is worthy of investigation.”

Cutting off your arm
The technology we use shapes our self-image, theorizes Dunne, and therefore becomes an integral part of who we are. “When the radio was invented, our brain was likened to a box with transmitters and receivers. Now we’re into computers, and the brain has become an information processor, a complex computer. These technologies and metaphors are a reflection of our self-perception, of our own evolution; we develop them as extensions of ourselves. But how can you detach that extension from notions of self and ideas of consciousness? That’s like cutting off your arm.”

But is she serious when she recommends pampering and coochy-cooing your computer equipment? Why is that smile playing on her lips? “I’m half-serious,” she insists. “I make fun of it, because yes, sometimes it gets silly.” Dunne likes to tell a story about Danish physicist Niels Bohr, who supposedly had a horseshoe over his barn door. During a visit, a colleague noticed the horseshoe and inquired, “Come now, Niels, you don’t believe in that nonsense, do you?” Whereupon Bohr smiled most agreeably, and replied: “Of course not, but I am told that it works whether you believe in it or not.”

Abstract: 80 groups of 7 chicks were used to test their ability to influence the trajectory of a robot bearing a candle as the unique source of light in the room. The robot is driven, via telephone line, by a random generator located 23 kilometres away. When chicks are present, the robot moves preferentially in their direction (66.25% of 80 trials). This is significantly different from the nonspecific displacement of the machine in the absence of chicks and observer (p < 0.00001). Since the random generator is the source of the movements, this result suggests that chicks are able to influence it over a long distance.
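The headline figure – the robot favoring the chicks’ side in 66.25% of 80 trials – can be sanity-checked against chance with a simple one-sided binomial sign test. This is only an illustrative sketch, not the statistic the authors used (their much smaller p-value was presumably computed from the displacement measurements themselves, not from per-trial counts):

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at
    least k 'toward the chicks' trials if the robot were unbiased."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials = 80
n_toward = round(0.6625 * n_trials)  # 66.25% of 80 trials = 53
p_value = binom_tail(n_trials, n_toward)
print(n_toward, p_value)
```

Counting trials this way, 53 of 80 is unlikely but not wildly so under pure chance (the tail probability is on the order of a few in a thousand), which is why a statistic built on the raw displacements would be needed to reach the reported significance level.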

Animal Magnetism
You can probably have some effect on the behavior of a sensitive random device if you really try – but chances are you’ll never be as good as a cage full of chicks. In a series of experiments carried out by René Peoc’h in collaboration with the Swiss Fondation Marcel et Monique Odier de Psycho-Physique, a small, self-propelled robot called a Tychoscope was allowed to wander around aimlessly in an enclosed room. A random generator determined the lengths of the robot’s straight-line movement and angles of rotation. Left to itself, the Tychoscope moved in entirely random patterns, and spent as much time in the left half of the room as it did in the right half.
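The Tychoscope’s null behavior – random segment lengths and rotation angles producing an even split between the two halves of the room – is easy to mimic in a toy simulation. Everything here is invented for illustration (room size, step lengths, turn distribution); it is not Peoc’h’s apparatus, only a sketch of why a balanced left/right occupancy is the baseline expectation:

```python
import math
import random

def simulate_tychoscope(steps, rng, room=1.0, max_seg=0.05):
    """Random-walk robot: each step applies a random rotation and a
    random segment length, clamping the position at the room's walls.
    Returns the fraction of steps spent in the left half of the room."""
    x, y = room / 2, room / 2
    heading = rng.uniform(0, 2 * math.pi)
    left = 0
    for _ in range(steps):
        heading += rng.uniform(-math.pi, math.pi)  # random angle of rotation
        seg = rng.uniform(0, max_seg)              # random segment length
        x = min(max(x + seg * math.cos(heading), 0.0), room)
        y = min(max(y + seg * math.sin(heading), 0.0), room)
        left += x < room / 2
    return left / steps

rng = random.Random(42)
# Averaged over many independent runs, an unbiased generator keeps the
# robot in the left half roughly 50% of the time.
runs = [simulate_tychoscope(5000, rng) for _ in range(40)]
print(sum(runs) / len(runs))
```

Any single run can wander lopsidedly for a while – that is just the nature of a random walk – which is why the experimenters compared occupancy against many no-chick control sessions rather than eyeballing one trajectory.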

But when a cage filled with live chicks was placed on one side of the room, the robot’s pattern changed dramatically. On average, it spent considerably more time in the area nearest the animals. It was as if the birds “willed” the robot to stay close. The chicks had two reasons for not wanting the robot to stray too far. One group had been “imprinted” (when they hatched, the first thing they saw was the Tychoscope, and they adopted it as their mother). Another group had not, but the chicks seemed to respond to the lit candle that was placed on top of the Tychoscope in the darkened room. The scientists assume from this that the chicks didn’t like the dark. By comparison, human operators who tried to “will” the robot to stay on one side of the room achieved much smaller and more erratic results.

Dean Radin, a researcher at the University of Nevada who is familiar with the Odier experiments, is not surprised by what appears to be the superior psychic aptitude of the baby chicks. “The level of motivation that was manipulated there is much higher than what is typically manipulated in a human experiment. For humans, such an experiment is relatively boring. The chicks, on the other hand, reacted as if their lives depended on it.” A second explanation could be that the birds were not hampered by rationalizations that might affect the results of a human volunteer. A person, no matter how open-minded, may subconsciously believe that the experiment is strange or silly. Says Radin: “When you have subjects who work on an instinctual level, it presumably leads to higher motivation and more remarkable results.”

