In a groundbreaking project, roboticists from Carnegie Mellon University teamed up with aerospace experts from Piasecki Aircraft and Boeing to give a big copter not only eyes and ears, but also a modicum of judgment—in other words, perception, planning, and control. Their system uses light-based sensors to scan the terrain below, building up first a two-dimensional map and then a three-dimensional representation. That way, a planning module can work out a route to a landing spot near a medical case in need of evacuation, making sure that the helicopter never flies through a tree, a telephone wire, or any other impediment. Once a landing site is chosen, the system sends the copter back to it for the landing run. The system continuously checks its plan against new data coming from the sensors and makes adjustments if necessary.
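To make the idea concrete, here is a minimal sketch of that perceive-plan-replan loop. It is not the CMU/Piasecki/Boeing system itself: it assumes a flat two-dimensional occupancy grid (the real system builds a three-dimensional representation), an A* search for an obstacle-free route to the landing site, and a replanning step that folds new sensor returns into the map. All function names and the grid scenario are illustrative.

```python
import heapq

def build_occupancy_grid(size, obstacle_cells):
    # Perception, greatly simplified: mark the grid cells that sensor returns
    # flag as occupied (a tree, a telephone wire, any other impediment).
    grid = [[0] * size for _ in range(size)]
    for r, c in obstacle_cells:
        grid[r][c] = 1
    return grid

def plan_route(grid, start, goal):
    # Planning: A* search over free cells; returns a list of cells from start
    # to goal, or None when no obstacle-free route exists.
    size = len(grid)
    h = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]
    visited = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < size and 0 <= nc < size and grid[nr][nc] == 0:
                heapq.heappush(frontier,
                               (cost + 1 + h((nr, nc)), cost + 1, (nr, nc),
                                path + [(nr, nc)]))
    return None

def replan_if_blocked(grid, route, new_obstacles, goal):
    # Control-loop step: fold fresh sensor returns into the map, and replan
    # only if the current route now crosses an occupied cell. (A real system
    # would replan from the aircraft's current position, not from the start.)
    for r, c in new_obstacles:
        grid[r][c] = 1
    if any(grid[r][c] for r, c in route):
        return plan_route(grid, route[0], goal)
    return route

if __name__ == "__main__":
    # A "telephone wire" of occupied cells across the middle of a 10 x 10 grid,
    # with gaps at either end.
    wire = [(4, c) for c in range(1, 9)]
    grid = build_occupancy_grid(10, wire)
    route = plan_route(grid, start=(0, 0), goal=(9, 9))
    print("initial route, cells:", len(route))

    # A later sensor pass detects an obstacle sitting on the current route,
    # forcing a detour.
    newly_seen = [route[len(route) // 2]]
    route = replan_if_blocked(grid, route, newly_seen, goal=(9, 9))
    print("route after replanning, cells:", len(route))
```

In this toy scenario the route detours around a newly detected obstacle, mirroring the way the real system keeps checking its plan against incoming sensor data and adjusting as it flies.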
Two roboticists from the team have now set up a spinoff company, Near Earth Autonomy, that is working on a Navy program to equip many kinds of full-size helicopters with robotic systems. The goal is to enable the machines to pick up and drop off cargo and medical casualties, even if the robotic system has not had the chance to fly over the site beforehand.
It’s a step-by-step approach, one that yields benefits at every stage. Robotic aviation will thus come not in one fell swoop—it will creep up on us.