RoboBonobo: Giving Apes Control of Their Own Robot
by Evan Ackerman / March 29, 2012
This is RoboBonobo. It’s a robotic ape. It’s got a water cannon on it, and it’ll eventually be able to chase you around under the direct control of real bonobos wielding wireless keyboards and iPads.
The bonobos at Bonobo Hope Great Ape Trust Sanctuary in Des Moines, Iowa, have gotten comfortable communicating with humans through the use of sequences of visual lexigrams. The apes can take advantage of a vocabulary of nearly 400 different words (like “hello” or “tickle” or “burrito”), and their human caretakers are looking to expand the ways in which the bonobos are able to interact with humans and the outside world. The humans have already built a prototype for a robot that the bonobos will be able to control directly, using it to “play chase games or squirt guests with an on board water gun.”
This project goes far beyond the robot, though. What Dr. Ken Schweller (a professor of computer science and psychology and chair of the Great Ape Trust) wants to do is develop a set of Internet-connected keyboards that the bonobos can carry around with them and use to communicate directly with humans. Humans, for their part, will be able to use an app that translates their speech directly to the symbols used by the bonobos, potentially opening up real-time two-way intelligent communication between you and another species.
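At its core, such a translation app needs a mapping from recognized English words to lexigram symbols. A minimal sketch of that lookup, with made-up symbol names (the actual Bonobo Chat vocabulary and data format are not described here):

```python
# Hypothetical sketch of speech-to-lexigram translation.
# The LEXIGRAMS mapping and symbol names are illustrative, not Bonobo Chat's data.

LEXIGRAMS = {
    "hello": "LEX_HELLO",
    "tickle": "LEX_TICKLE",
    "burrito": "LEX_BURRITO",
    "water": "LEX_WATER",
}

def speech_to_lexigrams(transcript: str) -> list:
    """Map each recognized word to its lexigram symbol, skipping unknown words."""
    symbols = []
    for word in transcript.lower().split():
        word = word.strip(".,!?")  # drop trailing punctuation from speech output
        if word in LEXIGRAMS:
            symbols.append(LEXIGRAMS[word])
    return symbols

print(speech_to_lexigrams("Hello! Want a burrito?"))  # ['LEX_HELLO', 'LEX_BURRITO']
```

A real app would sit behind a speech-recognition engine and render the symbols on screen, but the dictionary lookup above is the essential step.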
[Kickstarter video: http://www.kickstarter.com/projects/1237615061/bonobo-chat-an-app-for-talking-with-apes/widget/video.html]
RoboBonobo and Bonobo Chat are trying to raise $20,000 on Kickstarter; the funds will be used to “design, program, harden, and field-test the apps with bonobo testers and to connect them to robots and other external devices.”
[Interactive lexigram: http://www.greatapetrust.org/science/history-of-ape-language/interactive-lexigram/main.swf]
“The lexigrams, abstract symbols representing words, that are used today by the bonobos at Great Ape Trust resulted from the early work by Dr. Duane Rumbaugh and the chimpanzee Lana at the Yerkes Regional Primate Research Center in Atlanta. Creating lexigrams and connecting them to a computer revolutionized ape language research and led to the establishment of the Language Research Center at Georgia State University. This development of nonspeech communication has also assisted people with learning disabilities. Today, the lexigram panels used by the Trust bonobos represent nearly 400 words.”
Orangutans to Skype between zoos with iPads
by Sebastian Anthony / December 30, 2011
For the last six months, orangutans at the Milwaukee County Zoo have been playing games and watching videos on iPads, but now their keepers and the charity Orangutan Outreach want to go one step further and enable ape-to-ape video chat via Skype or FaceTime.
Orangutans, like their great ape brethren (gorillas, chimpanzees, and humans), are intelligent, inquisitive creatures — and, perhaps hinting at our shared genetic ancestry, they find the shiny, bright allure of an iPad almost irresistible. So far their favorite iPad pastimes have been games like Doodle Buddy and Flick Flick Football, and watching videos. One of the orangutans, a 31-year-old called MJ, is apparently a big fan of David Attenborough’s nature documentaries. “The orangutans loved seeing videos of themselves – so there is a little vanity going on – and they like seeing videos of the orangutans who are in the other end of the enclosure,” Richard Zimmerman of Orangutan Outreach said. “So if we incorporate cameras, they can watch each other.” And thus the idea of WiFi video chat between orangutans — and eventually between zoos — was born.
Now, this might just sound like a bit of folly — Orangutan Outreach is quick to note that these iPads were not bought with public donations — but just think of the research possibilities! Just imagine: if we put iPads into the soft, leathery mitts of orangutans all around the world, would they spontaneously strike up Skype conversations without human intervention? Gorillas, chimpanzees, and orangutans might not have complex speech like us, but they could still communicate via live video if the interface were simple enough (someone needs to make an app!). The great apes are our closest relatives, and everything we discover about their behavior increases our knowledge of how our own brains and physiology evolved.
Dan the baboon sits in front of a computer screen. The letters BRRU pop up. With a quick and almost dismissive tap, the monkey signals it’s not a word. Correct. Next comes ITCS. Again, not a word. Finally KITE comes up. He pauses and hits a green oval to show it’s a word. In the space of just a few seconds, Dan has demonstrated a mastery of what some experts say is a form of pre-reading, and he walks away rewarded with a treat of dried wheat.

Dan is part of new research showing that baboons are able to pick up the first step in reading — identifying recurring patterns and determining which four-letter combinations are words and which are just gobbledygook. The study suggests that reading’s early steps are far more instinctive than scientists first thought, and it also indicates that non-human primates may be smarter than we give them credit for. “They’ve got the hang of this thing,” said Jonathan Grainger, a French scientist and lead author of the research.
Baboons and other monkeys are good pattern finders, and what they are doing may be what we first do in recognizing words. It’s still a far cry from real reading: they don’t understand what these words mean and are just breaking them down into parts, said Grainger, a cognitive psychologist at Aix-Marseille University in France. In 300,000 tests, the six baboons distinguished between real and fake words about three out of four times, according to the study published in Thursday’s journal Science.
Four-year-old Dan, the star of the bunch and about the equivalent age of a human teenager, got 80 percent of the words right and learned 308 four-letter words. The baboons are rewarded with food when they press the right spot on the screen: a blue plus sign for bogus combos or a green oval for real words. Even though the experiments were done in France, the researchers used English words because it is the language of science, Grainger said. The key is that these animals not only learned by trial and error which letter combinations were correct, but also noticed which letters tend to go together to form real words, such as SH but not FX, said Grainger. So even when new words were sprung on them, they did a better job of figuring out which were real.
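The statistical cue described here — letter pairs like SH appearing in real words while FX does not — can be sketched as a simple bigram-frequency classifier. The tiny vocabulary and threshold below are illustrative stand-ins, not the study’s actual materials:

```python
from collections import Counter

# Count letter-pair (bigram) frequencies in a small illustrative vocabulary.
TRAINING_WORDS = ["kite", "shoe", "ship", "fish", "wash", "then", "this", "bite"]

bigram_counts = Counter()
for w in TRAINING_WORDS:
    for a, b in zip(w, w[1:]):
        bigram_counts[a + b] += 1

def bigram_score(s):
    """Average bigram frequency; real words tend to score higher than random strings."""
    pairs = [a + b for a, b in zip(s, s[1:])]
    return sum(bigram_counts[p] for p in pairs) / len(pairs)

def looks_like_a_word(s, threshold=1.0):
    """Crude word/non-word decision, like the baboons' green-oval vs. plus-sign choice."""
    return bigram_score(s) >= threshold

print(looks_like_a_word("kite"))  # True  (KI, IT, TE all occur in training words)
print(looks_like_a_word("brru"))  # False (no training bigrams match)
```

This mirrors the article’s examples: KITE passes while BRRU and ITCS fail, purely on letter-pair statistics and with no notion of meaning.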
Grainger said a pre-existing capacity in the brain may allow them to recognize patterns and objects, and perhaps that’s how we humans also first learn to read. The study’s results were called “extraordinarily exciting” by another language researcher, psychology professor Stanislas Dehaene of the Collège de France, who wasn’t part of this study. He said Grainger’s finding makes sense: Dehaene’s earlier work suggests a distinct part of the brain visually recognizes the forms of words, and the new work indicates this is also likely in a non-human primate. This new study also tells us a lot about our distant primate relatives. “They have shown repeatedly amazing cognitive abilities,” said study co-author Joel Fagot, a researcher at the French National Center for Scientific Research. Bill Hopkins, a professor of psychology at the Yerkes Primate Center in Atlanta, isn’t surprised. “We tend to underestimate what their capacities are,” said Hopkins, who wasn’t part of the French research team. “Non-human primates are really specialized in the visual domain and this is an example of that.”
This raises interesting questions about how the complex primate mind works without language or what we think of as language, Hopkins said. While we use language to solve problems in our heads, such as deciphering words, it seems that baboons use a “remarkably sophisticated” method to attack problems without language, he said. Key to the success of the experiment was a change in the testing technique, the researchers said. The baboons weren’t put in the computer stations and forced to take the test. Instead, they could choose when they wanted to work, going to one of the 10 computer booths at any time, even in the middle of the night. The most ambitious baboons test 3,000 times a day; the laziest only 400. The advantage of this type of experiment setup, which can be considered more humane, is that researchers get far more trials in a shorter time period, he said. “They come because they want to,” Fagot said. “What do they want? They want some food. They want to solve some task.”
Using iPads to bridge communication gap with dolphins / by Chris Foresman
Research scientist Jack Kassewitz is using iPads with custom-developed software to help facilitate two-way communication between humans and dolphins. Kassewitz has worked for years studying the behavior and communication patterns of dolphins. Numerous studies on dolphin language show signs of advanced intelligence, and Kassewitz believes that the high-frequency sounds dolphins make underwater can convey information that is holographic in nature. Since humans don’t communicate natively with holograms, Kassewitz is currently working on a project to build a symbolic language that dolphins and humans can use to communicate with one another. Kassewitz searched for nearly two years to find a touchscreen device that dolphins could reliably activate with their rostrum (or beak), while still being powerful enough to record or play back the high-frequency sounds associated with dolphin language and durable enough to work in underwater environments. He had originally settled on the Panasonic Toughbook, but recently began evaluating the iPad as an alternative.
The iPad is suited to Kassewitz’s research in a number of ways. “It’s small and lightweight,” Kassewitz told Ars. “It’s very forgiving. For example, if I turn it the ‘wrong’ way, it turns itself back the ‘right’ way. And the iPhone OS system is fast—more than fast enough for my use.” Kassewitz is currently using a sealable bag that protects the iPad underwater to depths of a few feet, though he is also working with Otterbox to make something more robust and with better anti-glare capabilities to make it easier for the dolphins to see the screen. Bluetooth allows him to connect to speakers to “hear” the underwater dolphin speech, and he can view a spectrograph of the sounds on the iPad’s screen.
Kassewitz is also taking advantage of the undocumented USB audio capabilities of the iPad Camera Connection Kit to interface with some specialized audio recording equipment. He uses a series of underwater microphones (or hydrophones) to record the unique sound patterns of dolphin speech made while interacting with the iPad, to try to determine which patterns are associated with the symbols displayed on the screen. “We think that once the dolphins get the hang of the touchscreen, we can let them choose from a wide assortment of symbols to represent objects, actions, and even emotions,” Kassewitz said. He believes that his team will then be able to develop a rudimentary symbolic language. “I’ve been doing this for a long time, just trying to understand dolphins as a species,” Kassewitz told Ars. “One of the things I am convinced of is that dolphins are as frustrated with us as we are with them in terms of attempting to have some kind of cross-species communication.”
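The spectrograph view mentioned above amounts to a short-time Fourier transform over the recorded samples. A minimal sketch, with illustrative window and sample-rate parameters (dolphin vocalizations can extend well above 100 kHz, so real recordings need very high sample rates):

```python
import numpy as np

def spectrogram(samples, window=256, hop=128):
    """Return a 2-D magnitude array: one FFT column per Hann-windowed frame."""
    frames = [samples[i:i + window] * np.hanning(window)
              for i in range(0, len(samples) - window + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T  # shape: (freq bins, frames)

# A 10 kHz test tone sampled at 192 kHz for 0.1 s:
rate = 192_000
t = np.arange(int(rate * 0.1)) / rate
spec = spectrogram(np.sin(2 * np.pi * 10_000 * t))
print(spec.shape)  # (129, number of frames): 256-point rfft gives 129 frequency bins
```

Plotting `spec` on a log scale against time and frequency gives the familiar spectrogram image; the energy in the test tone above concentrates near the 10 kHz bin.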
The first step in building that system of communication is a very simple game wherein a dolphin named Merlin is shown an object, such as a ball or a rubber duck. (Kassewitz told us that dolphins respond well to the color yellow.) Then Merlin has to point to an image of the object on the iPad’s screen, selecting it with his rostrum. “Games are a relatively simple way to build an understanding between two animals—humans included,” Kassewitz told Ars. “Games require agreements to work, and agreements require some high-level thinking.” Ultimately, Kassewitz will build a library of symbols that dolphins can recognize that form the basis of “a complete language interface between humans and dolphins.”
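The matching game described above can be sketched as a simple trial loop. The object names, layout, and scoring below are assumptions for illustration, not the SpeakDolphin software:

```python
import random

# Illustrative sketch of one matching trial: show an object, lay out all
# symbols in random order on the screen, and score the subject's touch.
OBJECTS = ["ball", "cube", "duck"]

def run_trial(shown_object, touch, rng=None):
    """Return True when the touched symbol matches the object that was shown."""
    rng = rng or random.Random()
    layout = OBJECTS[:]
    rng.shuffle(layout)               # randomize positions across trials
    touched = touch(layout)           # the subject's selection from the layout
    return touched == shown_object    # reward only on a correct match

# Simulated subject that always finds the matching symbol:
correct = run_trial("ball", touch=lambda symbols: symbols[symbols.index("ball")])
print(correct)  # True
```

Randomizing the layout each trial matters: it forces the subject to match on the symbol itself rather than on a fixed screen position.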
Kassewitz’s research team will conduct more tests this July, pitting the Toughbook directly against the iPad to determine which platform will be used going forward. However, he believes that the iPad’s size and weight advantage may prove to be the deciding factor. “We could use two or three iPads showing different sets of images, and the dolphin would be able to choose among them,” he said.
May 23, 2010 – “Last week, a young bottlenose dolphin named Merlin became the first of his species to join the growing number of enthusiasts using the Apple iPad. Dolphin research scientist Jack Kassewitz of SpeakDolphin.com introduced the iPad to the dolphin in early steps toward building a language interface. “The use of the iPad is part of our continuing search to find a suitable touch screen technology which the dolphins can activate with the tip of their rostrums or beaks. After extensive searching and product review, it looks like our choice is between the Panasonic Toughbook and the Apple iPad,” Kassewitz explained. “We think that once the dolphins get the hang of the touch screen, we can let them choose from a wide assortment of symbols to represent objects, actions and even emotions.”
Kassewitz explained the requirements of the technology. “Waterproofing, processor speed, touch-sensitivity, anti-glare screens, and dolphin-friendly programs are essential. As this database of dolphin symbols grows – we’ll need fast technology to help us respond appropriately and quickly to the dolphins.” The research was being conducted at Dolphin Discovery’s dolphin swim facility in Puerto Aventuras, Mexico along the picturesque coast now referred to as the Riviera Maya. The dolphin, Merlin, is a juvenile, born at the facility only two years ago. “Merlin is quite curious, like most dolphins, and he showed complete willingness to examine the iPad” said Kassewitz.
For now, the researchers are getting Merlin used to the touch screen by showing him real objects, such as a ball, cube, or plastic duck, then asking the dolphin to touch photos of those same objects on the screen. “This is an easy task for a dolphin, but it is a necessary building block towards our goal of a complete language interface between humans and dolphins,” Kassewitz said.”
email : jack [at] speakdolphin.com
SEE ALSO – CONVERSATIONAL DOLPHIN, for BEGINNERS