Scientists Guide Flying Robot with Their Thoughts


Thanks to the work of biomedical engineering professor Bin He and his team, this flying robot takes its orders from a person’s thoughts.

Using noninvasive 3-D brain-computer interfaces, scientists at the University of Minnesota have developed technology that allows them to use their thoughts to steer a flying robot.

It’s a staple of science fiction: people who can control objects with their minds.

At the University of Minnesota, a new technology is turning that fiction into reality.

In the lab of biomedical engineering professor Bin He, several young people have learned to use their thoughts to steer a flying robot around a gym, making it turn, rise, dip, and even sail through a ring.

The technology, pioneered by He, may someday allow people robbed of speech and mobility by neurodegenerative diseases to regain function by controlling artificial limbs, wheelchairs, or other devices. And it’s completely noninvasive: brain waves are picked up by the electrodes of an EEG cap worn on the scalp, not by a chip implanted in the brain.

A report on the technology has been published in the Journal of Neural Engineering.

“My entire career is to push for noninvasive 3-D brain-computer interfaces, or BCI,” says He, a faculty member in the College of Science and Engineering. “[Researchers elsewhere] have used a chip implanted into the brain’s motor cortex to drive movement of a cursor [across a screen] or a robotic arm. But here we have proof that a noninvasive BCI from a scalp EEG can do as well as an invasive chip.”

Mapping the brain

Professor He’s BCI system works thanks to the geography of the motor cortex—the area of the cerebrum that governs movement. When we move, or think about a movement, neurons in the motor cortex produce tiny electric currents. Thinking about a different movement activates a different assortment of neurons.

Sorting out these assortments laid the groundwork for the BCI, says He.

“We were the first to use both functional MRI and EEG imaging to map where in the brain neurons are activated when you imagine movements,” he says. “So now we know where the signals will come from.”

The brain map showed that imagining making fists—with one hand or the other or both—produced the most easily distinguished signals.

“This knowledge about what kinds of signals are generated by what kind of motion imagination helps us optimize the design of the system to control flying objects in real time,” He explains.
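
For illustration, the kind of mapping this implies might look like the hypothetical Python sketch below. The imagery classes (left fist, right fist, both fists, rest) follow the article’s description; which class drives which maneuver is an assumption, not the lab’s published control scheme.

```python
# Hypothetical mapping from imagined movements to flight actions.
# The imagery classes follow the article; the specific pairings are
# illustrative assumptions only.

IMAGERY_TO_ACTION = {
    "left_fist": "turn_left",
    "right_fist": "turn_right",
    "both_fists": "rise",
    "rest": "hover",
}

def action_for(imagery_class: str) -> str:
    """Look up the flight action for a decoded motor-imagery class."""
    return IMAGERY_TO_ACTION.get(imagery_class, "hover")
```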

Tapping the map

Monitoring electrical activity from the brain, the 64 scalp electrodes of the EEG cap report the signals (or lack of signals) they detect to a computer, which translates the pattern into an electronic command. Volunteers first learned to use thoughts to control the 1-D movement of a cursor on a screen, then 2-D cursor movements and 3-D control of a virtual helicopter.
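
To make the decoding step concrete, here is a minimal, hypothetical sketch of how a 64-channel EEG window might be turned into one of the imagery classes described above. The sampling rate, the 8–12 Hz mu-band feature, the electrode indices, and the simple power-imbalance rule are illustrative assumptions, not the published decoder.

```python
import numpy as np

FS = 256          # assumed EEG sampling rate, Hz
N_CHANNELS = 64   # scalp electrodes, as described in the article

def mu_band_power(window, fs=FS):
    """Per-channel power in the 8-12 Hz sensorimotor (mu) band."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[:, band].mean(axis=1)

def classify_imagery(window, left_idx, right_idx, baseline):
    """Crudely label one EEG window as left fist, right fist, both fists, or rest.

    Imagining a hand movement suppresses mu power over the opposite
    (contralateral) motor cortex, so the left/right power imbalance
    separates the two single-hand classes; a bilateral drop relative
    to a resting baseline suggests both fists.
    """
    power = mu_band_power(window)
    left, right = power[left_idx].mean(), power[right_idx].mean()
    imbalance = (left - right) / (left + right + 1e-12)
    if imbalance < -0.1:
        return "right_fist"   # left-hemisphere mu suppressed
    if imbalance > 0.1:
        return "left_fist"    # right-hemisphere mu suppressed
    if power.mean() < 0.8 * baseline:
        return "both_fists"
    return "rest"

# Usage with synthetic data: one 1-second window of 64-channel EEG.
rest_window = np.random.randn(N_CHANNELS, FS)
baseline = mu_band_power(rest_window).mean()
window = np.random.randn(N_CHANNELS, FS)
print(classify_imagery(window, left_idx=[10, 11], right_idx=[52, 53], baseline=baseline))
```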

Now it’s the real deal: controlling an actual flying robot, formally an AR [augmented reality] drone. Professor He’s computers interface with the WiFi controls that come with the robot; after translating the EEG brain signals into a command, the computer sends that command to the robot over WiFi.
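
The last hop might look like the hypothetical sketch below, which pushes a decoded action to the quadcopter over WiFi. The IP address, UDP port, and AT*PCMD message layout follow Parrot’s publicly documented AR.Drone protocol, but treat those details, along with the action names and gains, as assumptions rather than the lab’s actual implementation.

```python
import socket
import struct

DRONE_ADDR = ("192.168.1.1", 5556)   # assumed AR.Drone default AT-command address/port

def _f2i(x: float) -> int:
    """Reinterpret a 32-bit float's bits as a signed int (AT-command encoding)."""
    return struct.unpack("<i", struct.pack("<f", x))[0]

class DroneLink:
    def __init__(self):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.seq = 1

    def _send(self, msg: str) -> None:
        self.sock.sendto(msg.encode("ascii"), DRONE_ADDR)
        self.seq += 1

    def move(self, roll=0.0, pitch=0.0, gaz=0.0, yaw=0.0) -> None:
        """Send one progressive movement command (axis values in -1.0..1.0)."""
        self._send("AT*PCMD=%d,1,%d,%d,%d,%d\r" %
                   (self.seq, _f2i(roll), _f2i(pitch), _f2i(gaz), _f2i(yaw)))

    def act(self, action: str, gain: float = 0.2) -> None:
        """Translate a decoded flight action into a movement packet."""
        if action == "turn_left":
            self.move(yaw=-gain)
        elif action == "turn_right":
            self.move(yaw=gain)
        elif action == "rise":
            self.move(gaz=gain)
        else:                         # "hover": zero out all axes
            self.move()

# Usage: forward each decoded action as it arrives.
link = DroneLink()
link.act("turn_left")
```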

Future directions

The journal article describes how five men and women learned to guide the flying robot. The first author is Karl LaFleur, who was a senior biomedical engineering student during the study.

“Working for Dr. He has been a phenomenal experience,” says LaFleur, who plans to put his knowledge to use when he enters the U’s Medical School next year. “He has so much experience with the scientific process, and he is excellent at helping his students learn this process while allowing them room for independent work. Being the first author on a journal article is a huge opportunity that most undergraduates never get.”

“I think the potential for BCI is very broad,” says He. “Next, we want to apply the flying robot technology to help disabled patients interact with the world.

“It may even help patients with conditions like stroke or Alzheimer’s disease. We’re now studying some stroke patients to see if it’ll help rewire brain circuits to bypass damaged areas.”

Reference: “Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface” by Karl LaFleur, Kaitlin Cassady, Alexander Doud, Kaleb Shades, Eitan Rogin and Bin He, 4 June 2013, Journal of Neural Engineering.
DOI: 10.1088/1741-2560/10/4/046003

Image: University of Minnesota

1 Comment on "Scientists Guide Flying Robot with Their Thoughts"

  1. Interesting article. This technology could revolutionize the use of aerial robots and copters. Today, all copters are piloted with dedicated remote controls, or their flight trajectories are pre-programmed for pilotless aircraft and copters. This technology is a genuinely new way to control a robot or copter. I think it will be difficult to implement, since the pilot needs to know the environment the craft will fly through in order to set a proper flight path. It will be very useful for people with disabilities. Thanks for the great material.
