Robotic Intelligence

From insect eyes to robot spies

By Ford Burkhart | Illustrations by Robert J. Long, Nearsight Graphite | Photo by Jacob Chinn

Creating intelligent robots has proved far more challenging than Star Trek may have led us to believe. But the UA’s robotics team has found success by drawing inspiration from a surprising source — smart, talented insects.

Yes, insects. Houseflies, dragonflies, even a moth, which can drive a vehicle nicknamed the Mothbot.

The UA is at the forefront of a branch of neuroscience that could soon lead to a flying robot that can smell, hear, touch, and, most important, see with the acute compound eyes of the insect world.

“If you’re looking for an insect-inspired flying device to help the United States fight a desert war, you’ve come to the right place,” says Charles Higgins, an associate professor of neuroscience and electrical engineering. He’s a pioneer in the computational architecture needed to harness living neurons, or electronic replicas of them, in robotic systems. In short, he’s one of the world’s top experts on the intelligent robot.

Higgins’ lab in the Gould-Simpson building is populated with small machines that are as talented at their specific tasks as any robot anywhere. It’s also populated with the bees, houseflies, moths, and dragonflies whose living systems provide the insights that drive the robots’ designs. 

Of particular interest to Higgins are the insects’ visual systems. The insects he works with have compound eyes, consisting of thousands of individual photoreceptor units. Among their strengths, compound eyes provide a wide field of view and excel at detecting fast movement.

Dragonflies, for example, are probably the world’s best visual trackers. They can spot a mosquito or other small flying insect, fly out to intercept it, and munch on it — all in the blink of an eye.

Higgins wants to tap into that target-tracking information, which travels along the dragonfly’s ventral nerve cord (similar to a spinal cord), and feed it to his robots.

In the lab, Higgins reaches into a small plastic box of houseflies through a red glove attached to the entry hole. He corners one and gently takes hold of it. Each fly is precious; if one escapes, he has a large net at hand to recapture it.

Students attach an electrode to a target area in the fly’s brain. Under Higgins’ guidance, they learn to communicate with the fly’s brain, cell by cell, sending signals that make the fly wiggle or even walk.

The compound eyes of a fly, Higgins explains, are like clusters of thousands of tiny independent eyes, each with its own lens — almost like an array of minicomputers.
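
How do thousands of tiny eyes add up to motion detection? The textbook model of insect motion vision is the Hassenstein-Reichardt elementary motion detector, in which each pair of neighboring photoreceptors compares a delayed copy of one signal against the other. Higgins’ own algorithms aren’t spelled out here, so this Python sketch is illustrative rather than a description of his hardware:

    import numpy as np

    def reichardt_emd(left, right, tau=5.0, dt=1.0):
        """Hassenstein-Reichardt elementary motion detector (EMD).

        left, right: brightness over time at two neighboring
        photoreceptors of the compound eye (1-D arrays).
        tau: time constant of the delay (low-pass) filter.
        Returns a signal whose sign indicates motion direction.
        """
        alpha = dt / (tau + dt)  # first-order low-pass coefficient
        delayed_l = np.zeros_like(left, dtype=float)
        delayed_r = np.zeros_like(right, dtype=float)
        for t in range(1, len(left)):  # low-pass filter acts as a delayed copy
            delayed_l[t] = delayed_l[t - 1] + alpha * (left[t] - delayed_l[t - 1])
            delayed_r[t] = delayed_r[t - 1] + alpha * (right[t] - delayed_r[t - 1])
        # Correlate each photoreceptor with its neighbor's delayed signal;
        # the difference of the two mirror-image products is direction-selective.
        return delayed_l * right - left * delayed_r

    t = np.arange(200)
    left = np.sin(0.2 * t)              # brightness at the left receptor
    right = np.sin(0.2 * (t - 3))       # same pattern, arriving 3 steps later
    print(reichardt_emd(left, right).mean() > 0)  # True: rightward motion

One such detector per pair of neighboring lenses, replicated thousands of times across the eye, is the kind of massively parallel, low-power computation that makes insect vision so attractive for small robots.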

By mimicking those fly systems, Higgins has constructed robots that can process — understand, in a way — and report on what they are seeing. 

And that is precisely where the frontier exists in robotic intelligence.

Higgins’ moth-guided Mothbot proved it could turn right in response to the moth’s living eye looking to the right, an astounding feat of bioengineering. 

“This is very hard to do,” Higgins says. “This kind of brain/machine interfacing is tough. You are taking a living neural circuit, starting with a photoreceptor that is seeing photons, and converting the signal to guide the robot as it moves. The living tissue is directly driving motor output in a machine.”
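
What might that conversion look like in software, greatly simplified? Estimate a firing rate from the spike times picked up by the electrode, then map the left/right difference to a steering command. Every name and number in this Python sketch is hypothetical; it illustrates the signal chain Higgins describes, not the Mothbot’s actual interface:

    import numpy as np

    def firing_rate(spike_times, now, window=0.1):
        """Spikes per second on one electrode over the last `window` seconds."""
        recent = [t for t in spike_times if now - window <= t <= now]
        return len(recent) / window

    def turn_command(left_eye_spikes, right_eye_spikes, now, gain=0.02):
        """Map the left/right difference in firing rate to a steering
        command in [-1, 1]; positive turns the robot right, as when
        the moth's eye signals motion on its right."""
        diff = firing_rate(right_eye_spikes, now) - firing_rate(left_eye_spikes, now)
        return float(np.clip(gain * diff, -1.0, 1.0))

    # Hypothetical spike trains: the right-eye cell fires faster,
    # so the robot steers right.
    print(turn_command([0.91, 0.95], [0.90, 0.93, 0.96, 0.99], now=1.0))  # > 0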

Higgins, who has a Ph.D. in electrical engineering from the California Institute of Technology, performed top-secret research with military applications at the Massachusetts Institute of Technology’s Lincoln Laboratory, then returned to Caltech to work on electronic versions of brain circuits.

At the UA since 1999, Higgins has gone on to track visual targets using both biological and manufactured parts. Now he is tackling his toughest project yet.

The U.S. Air Force wants a new robot that can see with night vision and sniff out explosives during battlefield reconnaissance. Existing camera-based drone technology is no longer adequate. Higgins has been tasked with building something simple that soldiers can carry — a device that is easy to make and costs only a few thousand dollars.

“They want to be able to give every soldier a bunch of them,” Higgins says.

The Air Force-funded robot project will use an insect-inspired artificial compound eye, incorporating algorithms derived from the eyes of the dragonfly and the housefly. Higgins is collaborating with Stanley Pau of the UA’s College of Optical Sciences, who is designing revolutionary polarizing filters so that the artificial compound eye can see things the human eye cannot, and with Nicholas J. Strausfeld of the Department of Neuroscience, who provided the biological basis for the robot’s algorithms.
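
How might polarizing filters let a camera see what eyes cannot? One standard approach in polarization imaging, and a plausible guess at what filters like Pau’s enable, is to place micro-polarizers at four orientations over neighboring pixels and combine the four readings into Stokes parameters. A Python sketch under that assumption:

    import numpy as np

    def linear_polarization(i0, i45, i90, i135):
        """Stokes parameters from pixel intensities measured behind
        polarizers at 0, 45, 90, and 135 degrees.

        Returns the degree of linear polarization (0 to 1) and its
        angle (radians). Polarized glints, e.g. off man-made surfaces
        or water, stand out even when brightness alone shows nothing."""
        s0 = (i0 + i45 + i90 + i135) / 2.0  # total intensity
        s1 = i0 - i90                       # horizontal vs. vertical
        s2 = i45 - i135                     # the two diagonal components
        dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
        aolp = 0.5 * np.arctan2(s2, s1)
        return dolp, aolp

Bees famously navigate by the polarization pattern of the sky, so a polarization-sensitive compound eye has clear biological precedent.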

Higgins and his students are already driving a prototype in the halls, and he expects working devices to be delivered to the Defense Department as early as 2016. 

“My best hope is to save the lives of people in recon situations,” he says. “If a device can go over the next hill and report back information to target weapons, we could avoid sending humans where they are likely to get killed.”

Defense work aside, Higgins’ biggest dream is to build or grow replacement parts for damaged post-stroke human brains: neural prosthetics.

“Everything I do is with the goal of understanding brains well enough to construct artificial ones,” Higgins says. One of his main interests is the hippocampus, which governs long-term memory and deteriorates in people with Alzheimer’s. 
Insects have brain structures, called mushroom bodies, that are similar to the hippocampus, and that similarity leads Higgins right back to studying the fly.

“Perhaps the brain that insects have is a scaled-down version of what we have,” Higgins says. “If that’s so, we can work our way back up the tree to understand humans.”