By Laurel Hamers
CHICAGO — Google Maps can tell us where to find the nearest pizza joint in a strange city, but our ability to make our way through cluttered environments is still surpassed by other members of the animal kingdom. Scientists are harnessing the navigation strategies of these animals to create biomimetic robots that step in where humans fall short, researchers reported on Feb. 16 at the American Association for the Advancement of Science meeting in Chicago.
For instance, the weakly electric black ghost knifefish emits electric fields that help to reveal objects in its murky underwater habitat, rather than relying on its tiny, nearly useless eyes. A nearby object distorts the field the fish gives off; the fish senses this distortion and darts toward potential prey or away from obstacles. These electrosensing abilities — combined with a ribbon-fin tail that lets the fish swim rapidly in any direction — make the knifefish a master at detecting and catching prey.
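The core idea can be caricatured in a few lines of code. The sketch below is purely illustrative, not MacIver's system: it assumes the self-generated field is sampled by a ring of sensors, with all values invented, and flags an object as a local deviation from the undisturbed baseline.

```python
# Toy sketch of active electrosensing: a nearby object shows up as a
# local deviation of the sensed field from the undisturbed baseline.
# The sensor layout and values here are invented for illustration.

def detect_perturbation(baseline, reading, threshold):
    """Return the index of the sensor with the largest deviation from
    the baseline field, or None if nothing exceeds the threshold."""
    deviations = [abs(r - b) for r, b in zip(reading, baseline)]
    peak = max(range(len(deviations)), key=deviations.__getitem__)
    return peak if deviations[peak] > threshold else None

# Undisturbed self-generated field at eight sensors around the body.
baseline = [1.0] * 8

# An object near sensor 3 locally distorts the field.
reading = [1.0, 1.0, 1.1, 1.6, 1.1, 1.0, 1.0, 1.0]

print(detect_perturbation(baseline, reading, threshold=0.3))  # 3
```

Because the animal (or robot) emits the field itself, no external light source is needed — which is the basis of the power savings described below.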
The same technique could help scientists engineer robotic vehicles to navigate through low-visibility aquatic environments, said Malcolm MacIver of the Northwestern Neuroscience and Robotics Lab in Evanston, Illinois.
“The workhorses of underwater robotics currently are devices called remotely operated vehicles,” MacIver explained. These vessels perform important tasks ranging from conducting deep-sea research to containing oil spills. However, they are inefficient, drawing the same amount of power as ten microwave ovens and using half of that power to illuminate their path for the human operators overhead.
When MacIver's team outfitted an ROV with electrosensing capabilities to reduce its dependence on visual navigation, the robot’s electric field used less power than a 100-watt light bulb. The technology could transform underwater navigation, MacIver said, allowing ROVs to travel efficiently through even the murkiest of environments.
While adopting the conceptual principles behind animal behaviors can inspire such new machines, the relationship also works in the other direction. “If we understand the biology and anatomy of the control systems of an animal, we should be able to test them in a device and see if it works,” said neuroscientist John Hildebrand of the University of Arizona in Tucson.
Biorobotics researcher Barbara Webb of the University of Edinburgh, Scotland, is taking this approach. Her team uses robots to test their ideas about the navigation system of the desert ant, Cataglyphis velox.
Most ants leave trails of chemical signals to navigate, but the desert ant primarily uses visual cues to find food. Despite its low-resolution eyesight, it has a remarkable ability to remember a consistent route between its nest and a food source. After traveling a path just once, it retraces the same route on every subsequent trip.
Webb hypothesized that the ant’s pathfinding skills are based on recognition, not recall. In her scenario, an ant stores a series of images in its brain when it first traverses a route. On subsequent runs, it periodically compares the current view to the bank of stored views. If the image is familiar, the ant proceeds apace; if not, it rotates until it finds a familiar field of view.
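Webb's recognition-not-recall scenario amounts to a simple algorithm, sketched below under toy assumptions: panoramic views are reduced to short lists of brightness values, rotating the agent cyclically shifts its view, and "familiarity" is the distance to the closest stored snapshot. All names and data here are hypothetical, not Webb's implementation.

```python
# Toy sketch of recognition-based route following: store snapshots on
# the first run, then rotate until the current view looks familiar.

def view_distance(a, b):
    """Sum of absolute pixel differences between two panoramic views."""
    return sum(abs(x - y) for x, y in zip(a, b))

def familiarity(view, stored_views):
    """Lower is more familiar: distance to the closest stored snapshot."""
    return min(view_distance(view, s) for s in stored_views)

def best_heading(current_view, stored_views):
    """Scan all rotations and face the direction that looks most familiar.
    Rotating the agent cyclically shifts its 360-degree view."""
    rotations = range(len(current_view))
    return min(rotations, key=lambda r: familiarity(
        current_view[r:] + current_view[:r], stored_views))

# One snapshot stored on the first (outbound) traversal of the route.
stored = [[9, 1, 1, 1]]

# Later the agent faces the wrong way; rotating by 2 recovers the view.
print(best_heading([1, 1, 9, 1], stored))  # 2
```

Note that the agent never computes its position: it only asks "have I seen this before?", which is what makes the strategy feasible with low-resolution eyes and a small brain.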
To test this hypothesis, Webb's group constructed an ant-bot — a smartphone on wheels outfitted with a 360-degree camera — that mimics the ant’s strategy. The bot demonstrated the same navigational prowess as its insectile inspiration. “The experience of physically building the robot is quite important to understanding what the challenges are [in the animal],” Webb said.
Whether by enhancing human abilities or by testing hypotheses about animal mechanics, robots inspired by biology are blurring the boundaries between nature and technology. In doing so, their creators have built an exciting interdisciplinary field.
Laurel Hamers is a senior at Williams College majoring in biology with a minor in neuroscience. Her work has most recently appeared on Materials360Online. Read her blog at sciencescope.wordpress.com or email her at laurel.hamers@gmail.com