The human factor in autonomous machines.

Author: Magnuson, Stew
Position: Technology Tomorrow

Who hasn't cursed at a car that failed to start? It's not even an unusual sight to see someone angrily kicking a broken-down vehicle on the side of the road.


In the military world, there are the explosive ordnance disposal technicians who name their robots and give them funerals when "they sacrifice their lives" on the battlefield.

Even if they do inspire both love and hate, these are just machines--inanimate objects with no feelings at all.

And we all know that. Humans, on the other hand, are emotional beings, and sometimes those feelings extend to the machines in our everyday lives.

The Defense Science Board recently released a report that addressed autonomous systems in the military. One of its main conclusions was that for the technology to make inroads and be accepted, users must "trust" the machines.

Autonomy has been singled out as a key component of the third offset strategy, which intends to deliver leap-ahead battlefield technologies. There is a lot of food for thought in the report on how members of the military may interact with these autonomous systems.

"Trust is complex and multidimensional," the Defense Science Board Summer Study on Autonomy stated. "The individual making the decision to deploy a system on a given mission must trust the system."

And, as the report points out, trust is something that is earned. Those who are designing autonomous systems have to produce something that inspires confidence.

These military systems aren't conveyor belts automatically sorting out packages in a benign setting for UPS. These are machines intended for battlefields, and a failure to work as advertised could cost lives, the report said.

Trust taps into human emotions in ways that strip us down to basic instincts. An ineffective interface is one of the many potential barriers to building confidence in machines, the report said.

There is the famous case of male German BMW owners who complained about their first-generation GPS devices because they didn't want to take directions from an artificial female voice.

Yet computer-generated voices in the commercial world tend to be female and as pleasant as possible: think Siri or those frustrating automated customer service calls to a bank.

Is that the right tone for artificial intelligence in the heat of battle, or will a soldier respond better to a voice more akin to his or her drill sergeant from boot camp?

Trustworthiness also depends on...
