The Army Wants Autonomous Aviation Tech. But Do Pilots Trust It?

Chief Warrant Officer 3 Troy Willis performs a pre-flight inspection of a UH-60M Black Hawk, Camp Buehring, Kuwait, Feb. 27, 2018.(U.S. Army/Sgt. Thomas X. Crough)

U.S. Army leaders are looking to autonomous technology to be the game-changer on the future battlefield, but experts are wrestling with how the service will convince aviators and leaders to trust machines to help them make life-or-death decisions in a split second.

Part of the Army's new modernization effort involves manned-unmanned teaming, a concept that will rely on unmanned, autonomous aircraft and ground vehicles working, in some cases, as forward scouts to identify and select targets much quicker than humans can.

Army leaders have stressed that there will always be a "human in the loop" to prevent misjudgments that could result in unintended casualties. But aviators and leaders are still reluctant to trust machines to think for themselves.

"Trust in autonomy is going to be a challenge as we move forward; there is a huge psychological component to it," Patrick Mason, deputy for the Army's Program Executive Office Aviation, told an audience Wednesday at the Association of the United States Army's Aviation Hot Topic event.

Col. Thomas von Eschenbach, director of the Capability Development and Integration Directorate at the Army's Aviation Center of Excellence, has been running simulations to experiment with how autonomy and artificial intelligence can make aviators more effective.

"When you add autonomy and you add AI ... you quicken the pace of decisions," von Eschenbach said. "We don't want to take things away from a human; we want to enable humans to be faster [and] more agile, and make the decisions inside somebody else's decision cycle.

"But when you do that, it [becomes clear] that now the burden is on the commander to make the right decision at the right time."

The challenges for commanders will become much greater when they are relying on an unmanned system operating at great distances to accurately relay targeting information to a long-range fires system, he said.

"You no longer have the luxury of having your own sensor and your own organic platform to shoot; it's often something else sensing way outside of your ability and something shooting back here," von Eschenbach said. "When we saw that, we were like, 'wow, the commanders are really going to have to be on it to make a decision really fast.' It's not just machines; it's actually the commander making decisions at that speed also."

Chris Van Buiten, vice president of Innovations at Sikorsky, had a different take on the issue. He told a short story about "an interesting experience with the U.S. Air Force."

"You have the white-scarfed gang, a fighter pilot with a scarf around their neck, saying 'I'll be damned if I am giving up my scarf to a supercomputer. I don't trust it. I don't want it, and I like my job,'" he said.

This attitude, Van Buiten said, changed when the company installed an "auto-collision avoidance system in the F-16."

"There is a system in there that, when you black out in the mountain pass -- you pass out, hypoxia, whatever it is, you pull too many Gs -- you are going to come to in stabilized flight above terrain. And the computer is going to ask you 'do you feel alright to take over?'" he said. "It only took about six saves for that pilot community to go, 'oh heck yeah. I need that on my airplane.'"

Those six pilots began passing the word, he said.

"And then immediately, you started to build trust, and the question immediately pivoted to, 'what else can you do for me?'"

To Van Buiten, trust will be built over time with small advances in autonomy rather than through a "big-bang approach."

Retired Maj. Gen. Walter Davis, vice president of Army Aviation Programs at Cypress International, agreed, but said he believes the next generation will be much more accepting of autonomy.

"It's not going to be about us; the people that are going to be using these systems are going to be far more trusting than we are of autonomy," Davis said. "They will have grown up with autonomy. They are going to see more and more autonomy. ... I think that is the way we will work through these systems, but the next generation is not going to have these problems that we have."

-- Matthew Cox can be reached at
