13th ICCRTS

“C2 for Complex Endeavors”

Title: “Command and Control of Unmanned Systems in Distributed ISR Operations”

Topic 2: Networks and Networking

Topic 1: C2 Concepts, Theory, and Policy

Topic 9: Collaborative Technologies for Network-Centric Operations

Authors:

Elizabeth K. Bowman, Ph.D.

Jeffrey A. Thomas, M.S.

Army Research Laboratory – Human Research and Engineering Directorate

Chris Winslow

Army Research Laboratory – Computer and Information Sciences Directorate


Command and Control of Unmanned Systems in Distributed ISR Operations

This paper describes a series of controlled experiments investigating issues related to human-robot teaming and network-centric operations. Ten Soldiers from the New Jersey Army National Guard participated in the experiments and used multiple unmanned technologies [two Packbot Small Unmanned Ground Vehicles (SUGVs), one Unmanned Air Vehicle (UAV), and six Family of Unattended Ground Sensors (FUGS) systems] to conduct platoon-level ISR activities. Experiment objectives were coordinated to address issues within and among the physical, communications, information, and human (cognitive/social) domain layers of the network. Objectives included:

·  Cognitive: Build confidence in, and demonstrate, a predictive performance tool for robotic operators; measure operator situational awareness and workload during missions; and investigate how to pre-process and present asset information to the end-user.

·  Social: Document the ad hoc development of social, task, and knowledge networks during missions.

·  Physical: Demonstrate an agile computing infrastructure operating over two networks, demonstrate sensor radio and software integration for common asset transport, and conduct tele-op experiments with the Packbot SUGV to compare WSRT and ARL 802.11 radios for communication distance and to qualitatively evaluate streaming video performance under both conditions.[1]

These objectives support the overall goal of the experiment: to determine the impact of unmanned technologies on human situational awareness and workload. The experiment was conducted over three weeks in July 2007. Soldiers were assigned positions in an ISR platoon either to operate unmanned systems or to man vehicle-mounted FBCB2 displays, integrating information from sensor operators and generating responses to a higher commander's priority information requirements. Missions were conducted in three types of terrain: forested, open rolling sand, and urban. Various scenarios were presented each day to simulate problems likely to be encountered in current operations. Twelve actors served as the opposition force (OPFOR) and used three vehicles. The ten Blue Force Soldiers used five vehicles instrumented with an enhanced FBCB2[2]. The Soldiers and the UGV operators were positioned so that they could see OPFOR activity only through one of the three types of sensors. To complicate the scenarios and avoid a ceiling effect on situational awareness results, the OPFOR was divided into groups operating in different areas of the test site. The UGV operators were also positioned apart from each other so that each could view only one OPFOR group through their sensor.

High-level results are now emerging from the data collected during the experiment. These results, based on preliminary analysis, can be characterized as follows.

·  Scores on the cognitive test battery appear to correlate strongly with SUGV operator performance; given the ease of test administration, this tool is highly recommended for operator selection (a hypothetical analysis sketch follows this list). These results suggest that rapid decision making, time estimation, and spatial orientation are critical skills for operating unmanned systems.

·  Situational awareness results demonstrate that unmanned technologies contribute differentially to warfighter understanding of enemy activities. For example, the Robotics NCO, who fused information from separate sensor systems, experienced a higher level of SA than the individual sensor operators, but also reported the highest workload. Automated fusion technologies are needed, along with TTPs that describe how unmanned technologies are integrated within a team and how information is pushed across the team.

·  Social, task, and knowledge networks were documented to capture the key relationships and categories of information needed for decision making (see the network sketch after this list). These networks can be used to inform information protocols in future experiments and may provide initial support for new organizational structures.
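
To illustrate the first result above, the following is a minimal sketch, not the authors' analysis code, of how cognitive test battery subtest scores might be correlated with operator performance. All scores are hypothetical placeholders, and the subtest names simply echo the skills identified in the bullet.

    # Minimal sketch: Pearson correlation of cognitive battery subtests
    # with SUGV operator performance. All data are hypothetical.
    from scipy.stats import pearsonr

    # Hypothetical per-operator scores (n = 10 Soldiers, as in the experiment).
    battery = {
        "rapid_decision_making": [52, 61, 47, 70, 58, 66, 49, 73, 55, 63],
        "time_estimation":       [40, 55, 38, 62, 50, 59, 42, 66, 47, 57],
        "spatial_orientation":   [33, 48, 30, 57, 44, 52, 35, 60, 41, 50],
    }
    # Hypothetical composite performance score for each operator.
    performance = [48, 60, 44, 69, 55, 64, 47, 72, 52, 61]

    for subtest, scores in battery.items():
        r, p = pearsonr(scores, performance)
        print(f"{subtest}: r = {r:.2f}, p = {p:.3f}")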
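
The documented networks from the third result can be analyzed with standard graph tools. The following is a minimal sketch assuming a networkx representation; the role labels and edges are hypothetical, not the networks actually recorded in the experiment.

    # Minimal sketch: represent an observed task network and rank nodes by
    # in-degree centrality to surface fusion points. Edges are hypothetical.
    import networkx as nx

    task_net = nx.DiGraph()
    # Hypothetical "who pushes task-relevant information to whom" edges.
    task_net.add_edges_from([
        ("SUGV operator 1", "Robotics NCO"),
        ("SUGV operator 2", "Robotics NCO"),
        ("UAV operator",    "Robotics NCO"),
        ("UGS monitor",     "Robotics NCO"),
        ("Robotics NCO",    "Platoon leader"),
        ("Platoon leader",  "Higher commander"),
    ])

    # High in-degree centrality flags information-fusion roles such as
    # the Robotics NCO noted in the SA results above.
    for node, c in sorted(nx.in_degree_centrality(task_net).items(),
                          key=lambda kv: -kv[1]):
        print(f"{node}: {c:.2f}")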

These emerging and preliminary results show that many challenges exist across the layers of the network domain architecture. At the physical layer, the challenge is to develop a mobile ad hoc network (MANET) architecture that can support mobile platforms and extended vehicle/dismount ranges in a variety of terrain conditions. At the communications layer, researchers have demonstrated that asset gateways do not function in a 'one size fits all' manner and need support from the various sensor systems to store information when required. Data tagging by sensor systems also remains a problem in the effort to develop a common data format. How to pre-process and present asset information to the end-user is another continuing challenge, one that spans the communication and cognitive domains. At the information layer, TTPs and computer-supported fusion algorithms are needed to decrease the amount of data sent over the network and to increase the targeted provision of data to individuals based on need. At the cognitive/social levels of the architecture, researchers need to understand what information Soldiers need from a network, when that information is of maximum use, and what form it should take.
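
As one illustration of the common-format problem, the sketch below shows one way sensor reports from heterogeneous assets (SUGV, UAV, FUGS) could be tagged for a single transport. This is an assumption for illustration, not a fielded standard; all field names are hypothetical.

    # Minimal sketch: a hypothetical common, tagged wire format so reports
    # from different sensor systems can share one asset gateway.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class SensorReport:
        asset_id: str        # e.g., "SUGV-1", "UAV-1", "FUGS-3" (hypothetical)
        asset_type: str      # "SUGV" | "UAV" | "UGS"
        timestamp: float     # seconds since epoch, UTC
        lat: float
        lon: float
        report_type: str     # e.g., "detection", "image", "status"
        payload: dict        # sensor-specific content

    report = SensorReport(
        asset_id="SUGV-1", asset_type="SUGV", timestamp=1184851200.0,
        lat=40.02, lon=-74.59, report_type="detection",
        payload={"target": "vehicle", "confidence": 0.8},
    )
    print(json.dumps(asdict(report)))  # common format for the gateway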

Analysis of the data collected in this experiment is nearly complete. A final report will be available by November 2007.


[1] The Tele-op experiment was conducted in coordination with, and at the request of, the Future Combat System (FCS) PM Brigade Combat Team (BCT) Network System Integrator (NSI).

[2] The enhanced FBCB2 version was developed by CERDEC and includes the features of instant messaging, image presentation from sensors, and a display that shows the status of all network nodes.