Spatial Auditory Human-Machine Interface for UxV Teleoperation - SPATEL

Name: Spatial Auditory Human-Machine Interface for UxV Teleoperation
Acronym: SPATEL
Funding scheme: Office of Naval Research - NICOP Research Grant
Grant Agreement Number: 229553
Total budget: 129,431.88 USD
Start and end dates: 01/04/2009 - 31/03/2012
Coordinator: University of Zagreb Faculty of Electrical Engineering and Computing, LABUST
Social media: LABUST Facebook page

Abstract

For teleoperated unmanned vehicles, mishaps tend to occur during periods of high workload, in situations where the operator must perform complex and stressful tasks. A contemporary control room for a teleoperated UUV is filled with screens, and information is presented almost exclusively through the visual channel. To address the human-factors problems associated with overloading the operator's visual channel, we proposed the use of an auditory display as a means to reduce visual workload, enhance situation awareness, and mitigate the visual and cognitive demands of contemporary marine teleoperation. The auditory display presents navigation feedback data and is especially suitable because humans rely on the auditory modality for both building and maintaining situation awareness in natural environments.

1. Excellence

The research investigated the integration of a novel, robust auditory human-machine interface into the control system of UxVs. The research objectives were to:

  • determine the mathematical foundations for guiding UxVs to achieve different control objectives using the Spatial Auditory HMI (see the sketch after this list)
  • determine a psychoacoustic method for providing operators with better-than-normal localization ability
  • integrate the developed algorithms into a simulator and the auditory HMI, and perform evaluation experiments in real-life environments
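
The first objective can be illustrated with a short sketch of how a guidance reference might be rendered over a spatial auditory channel. The Python example below is hypothetical and is not the project's actual rendering pipeline: it applies constant-power stereo panning to a pure tone so that the left/right balance encodes the azimuth of the guidance reference. The function name, the frontal-plane mapping, and the 440 Hz tone are assumptions made purely for illustration.

    import numpy as np


    def pan_guidance_tone(azimuth_deg, duration_s=0.5, freq_hz=440.0, fs=44100):
        """Render a guidance tone whose stereo balance encodes azimuth.

        Hypothetical illustration: negative azimuth = port (left ear),
        positive azimuth = starboard (right ear).  Constant-power panning
        keeps perceived loudness roughly constant across azimuths.
        """
        # Clamp to the frontal half-plane and map to a pan angle in [0, pi/2]:
        # 0 -> hard left, pi/4 -> centre, pi/2 -> hard right.
        azimuth = np.clip(azimuth_deg, -90.0, 90.0)
        pan = (azimuth + 90.0) / 180.0 * (np.pi / 2.0)

        t = np.arange(int(duration_s * fs)) / fs
        tone = np.sin(2.0 * np.pi * freq_hz * t)

        left = np.cos(pan) * tone
        right = np.sin(pan) * tone
        return np.stack([left, right], axis=1)  # shape: (samples, 2)


    # Example: a reference 30 degrees to starboard yields a tone biased to the right ear.
    stereo_frame = pan_guidance_tone(30.0)

A full spatial auditory display would typically use head-related transfer functions (HRTFs) rather than simple panning, but the sketch shows the basic idea of mapping a guidance angle to a binaural cue.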

2. Impact

The impact was expected in the following focus areas:

  • Human/Unmanned Systems Collaboration - natural modes of interaction: the system provides human-machine interaction via an interface that exploits hearing, one of the natural senses used to build situational awareness
  • Integrated Layered Defence Across the Entire Detect-to-Engage Continuum - reliable 360-degree threat targeting and tracking: the auditory display provides full 360-degree coverage of objects or targets of interest
  • Manpower, Personnel, Training and Education - techniques to shorten training time, reduce training costs and maximize training impact: development of the real-time simulator enables time- and cost-efficient training of future operators and increases the training impact.

3. Implementation - goals and achievements

During the project the following goals were achieved:

  1. the resulting kinematic controller provided a comprehensible and effective reference that enabled the operator to successfully perform path following and trajectory tracking;
  2. the calculated reference information was preconditioned before being presented to the operator in order to overcome the main disadvantages of the hearing channel: limited hearing resolution and bias in the region of interest (see the sketch after this list);
  3. the enhanced audio display improved the tracking quality of path following by 35% and of trajectory tracking by almost 70%, for an increase in control effort of only 14%;
  4. the interface was evaluated by users, who found it intuitive, easy to use, and comfortable, and reported very good situational awareness.
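
The preconditioning mentioned in item 2 can be illustrated with a minimal, hypothetical sketch. Assuming, purely for illustration, that small heading errors are expanded with a compressive power law before sonification, so that deviations near zero stay audible despite the limited angular resolution of human hearing, a Python example might look as follows; the function name, the power-law form, and the parameter values are assumptions and not the project's published algorithm.

    import numpy as np


    def precondition_heading_error(error_deg, gamma=0.5, max_error_deg=90.0):
        """Expand small heading errors before they are sonified (hypothetical).

        Human auditory localization is relatively coarse, so small guidance
        errors rendered one-to-one would be hard to hear.  A power law with
        gamma < 1 stretches the region around zero error, trading resolution
        near the limits for audibility of small deviations.
        """
        e = np.clip(error_deg, -max_error_deg, max_error_deg)
        # Normalise to [-1, 1], apply the power law, then map back to degrees.
        normalised = e / max_error_deg
        expanded = np.sign(normalised) * np.abs(normalised) ** gamma
        return expanded * max_error_deg


    # Example: a 5-degree error is presented as roughly a 21-degree cue,
    # comfortably above typical auditory localization thresholds.
    print(precondition_heading_error(5.0))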