Efficient Extraction of Episodic Evidence from
Experimental Events of Ephemeral Existence:
Discovering the Story from Battlefield Simulations!
Bob Lucas and Dan Davis
University of Southern California
Information Sciences Institute
The School of Engineering
Marina del Rey, California
(310) 448-9449 and 448-8434
Abstract
The value of battlefield simulations would be best realized if the results of these exercises yielded a cogent and coherent account of what happened, and why. Current practice does not extract as much wisdom from these large and expensive operations as could be apprehended if additional, present-day analytical techniques were implemented. Building on experience with warfighters and Joint Experimentation (J9) at the Joint Forces Command, the authors lay out how the analysis of global-scale, high-resolution simulations could and should be accomplished. There is a manifest need for precise descriptions of outcomes, improving on After Action Reporting Systems that today rely on string aggregation analytical tools and Subject Matter Experts. Proven techniques are discussed for logging, archiving, extracting, and analyzing simulation data to produce the lessons that should be learned from large-scale exercises. This is followed by an examination of possible research and development paths toward storytelling, employing A/I technology for finding the critical thread, and engaging JFCOM's J9 test-bed to ensure smooth integration as well as to conduct experiments. All of this is designed to ensure longer-term understanding of simulation outcomes and to parameterize these in a way that allows for verifiable metrics of success. In an ever-accelerating threat environment, this capability will become increasingly vital.
Background and Introduction
There is a current and urgent call for the expansion of analytical capabilities in extracting important insights from battlefield simulation (Dubik, 2003). This is reflected as well in the general regard in which simulation is held by the leadership of the Defense community (Sega, 2003). While the venerability of many of the systems in use today speaks highly of their stability and utility (Ceranowicz, 2002), the insights extracted from them have not been universally applauded (Van Riper, 2002).
The U.S. Department of Defense (DoD) has for several decades conducted a series of experiments to model and simulate the complexities of the modern battlefield. In support of their mission, their analysts needed to conduct interactive experiments with entity-level simulations, using programs such as the Semi-Automated Forces (SAF) family of Intelligent Agent (IA) simulations (Ceranowicz, 2002). These needed to be run at a scale and level of resolution adequate for modeling the complexities of military operations on both traditional battlefields and in urban settings populated with civilians. All of this led the government analysts to require simulations of at least 1,000,000 vehicles, aircraft, or other entities on a global-scale terrain database with appropriate high-resolution insets. Experimenters using large numbers of Linux PCs distributed across a LAN found that communications limited them to tens of thousands of vehicles, roughly two orders of magnitude short of that requirement.
Current PC- and workstation-based capabilities do not allow the analyst to conduct these experiments at the necessary scale and level of resolution. These constraints have also been observed in other varieties of simulation that rely on equation-based methods (Lucas, 2003).
Current initiatives at JFCOM foresee using what has been named the Distributed Continuous Experimentation Environment (DCEE), with an eye toward twenty-four-by-seven operations accessible from anywhere on the Defense Research and Engineering Network (DREN). This will be of great use to the Joint Urban Operations (JUO) group, which is assessing the needs of U.S. forces fighting in urban settings in any number of countries.
This initiative is taking a phased approach: it will start by establishing a reconnaissance capability for the urban environment, impose a "red team" threat in the form of an interdiction mission, develop a counter-response to that threat, and then initiate a strike against the bases of the "red team" actors. Each phase is expected to take several months to conceptualize, plan, and implement.
Participating in these efforts has enabled ISI both to develop the skills and collaborations necessary to supply needed technology effectively and to complete proof-of-concept runs in each of the major technology areas. These include global-scale terrain, million-entity operations, distributed logging, ground-truth archiving, and database management.
ISI's team from its Scalable Systems Division, led by Dr. Ke-Thia Yao, has been part of the effort to enable and improve the Future After Action Reporting System (FAARS), the current program for analyzing simulations. Dr. Yao and the rest of the ISI team have implemented or evaluated story-creation capabilities such as query tools for distributed logging, Artificial Intelligence (A/I) for story completion, and other advanced database techniques. Other staff at ISI have investigated or developed more esoteric analytical methods such as data mining, evolutionary computing, genetic algorithms, neural net training, and heuristic computing (Davis, 2004). Some of these are set forth below.
Potential Utilities for Extraction
Data Mining
Forces Modeling and Simulation (FMS) activities produce huge data outputs, and these are frequently saved in large databases. Much of this product is intentionally created and archived for the specific purpose of the analysis to which it is later subjected. Additionally, there are similarly huge data sets created incidentally to the object of later analysis, e.g., data generated by training simulations where the object is exclusively training and no significant thought was given to analyzing the data further. The extraction of useful information from these types of data sets is what is called "data mining," and that is the meaning we adopt in this paper. More particularly, we, and others, use the term specifically to imply the extraction of insights from data that were not originally collected for that purpose.
Data mining techniques have proven illuminating in many fields but, in the authors' experience, have not been enthusiastically employed by the FMS community. The efficacy of data mining has been demonstrated in identifying new insights that would otherwise have gone unnoticed. Some authors, more functionally, have described data mining as lying at the intersection of statistics, machine learning, data management, pattern recognition, artificial intelligence, and other related disciplines. The authors see it as the application of myriad techniques to accomplish its goals, but not as subsuming all of those techniques into itself. Its focus on "unsuspected relationships" and on summarizing data in "novel ways that are both understandable and useful" (Hand, 2002) is the capability that seems most promising for FMS data analysis.
It might be worth noting that this analysis of data, in search for operational insights, is mirrored in the field of the combat historian as well. From Xenophon through S. L. A. Marshall, writers who experienced combat and those who interviewed others who had, have struggled and agonized with the task of extracting both the story of the activity and the wisdom to be gained therefrom. In each case, both in history and in simulation, the analysts seek to discover novel relationships and valuable new concepts. Mere restatement of readily discerned or previously known relationships is not productive.
The matrix of data obtained by the FMS analyst is often quite sparse. Sometimes the analyst has the opportunity to generate or recover some of the missing data, but often the best use must be made of the data despite the gaps. The structure of the data is likewise often outside the analyst's purview. Especially in the case of training data, there will be little, if any, opportunity for re-runs. Data mining techniques remain useful (perhaps most useful) in these situations. With the power of scalable parallel supercomputers, once the data have been characterized and values ascribed to various outcomes, recursive analysis can find useful new views of what was critical to the outcome.
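To make the missing-data point concrete, a minimal sketch follows; the record fields and values are hypothetical stand-ins for logged simulation output, not any fielded schema. It shows both summarizing over only the observed values and mean-imputing a column for downstream tools that need a dense matrix.

```python
from statistics import mean

# Hypothetical after-action records; None marks fields the logger never
# captured. Field names and values are illustrative only.
records = [
    {"entity": "tank-01", "rounds_fired": 14, "hits": 3},
    {"entity": "tank-02", "rounds_fired": None, "hits": 5},
    {"entity": "uav-07", "rounds_fired": 6, "hits": None},
]

def observed(field):
    """Collect only the values that were actually logged for one field."""
    return [r[field] for r in records if r[field] is not None]

# Make the best use of what exists: summarize over observed values only...
fired = observed("rounds_fired")
print(f"rounds_fired: n={len(fired)}, mean={mean(fired):.1f}")

# ...or impute the column mean so downstream tools see a dense matrix.
fill = mean(fired)
dense = [r["rounds_fired"] if r["rounds_fired"] is not None else fill
         for r in records]
print(dense)
```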
Data mining can be more generally said to require some significant effort in each of the following tasks:
- initial data analysis
- model of the data under analysis
- prediction of results/relationships
- analysis of the data sets
This capability mainly requires the appropriate application of algorithms to determine structures, compare results, optimize the search for new relationships, and handle the data in a way that is well suited to the tasks intrinsic to data mining. The application of the broad range of known statistical techniques is central to the efficient and timely use of the information under analysis.
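A toy illustration of those four tasks follows; the (entity class, outcome) log format is an assumption of ours for illustration, not the format of any fielded logging system.

```python
from collections import Counter, defaultdict

# Hypothetical (entity class, outcome) pairs pulled from archived logs.
events = [("armor", "survived"), ("armor", "lost"), ("infantry", "survived"),
          ("armor", "survived"), ("uav", "lost"), ("infantry", "survived")]

# 1. Initial data analysis: simple frequency summaries.
print(Counter(kind for kind, _ in events))

# 2. Model of the data: per-class survival rates estimated from the logs.
tallies = defaultdict(Counter)
for kind, outcome in events:
    tallies[kind][outcome] += 1
model = {k: t["survived"] / sum(t.values()) for k, t in tallies.items()}

# 3. Prediction: apply the fitted rates to an unseen entity class.
def predict(kind):
    return model.get(kind, 0.5)  # fall back to an uninformative prior

print(predict("artillery"))  # 0.5: no evidence either way

# 4. Analysis of the data sets: flag classes whose rates diverge from the
#    pooled rate -- candidate "unsuspected relationships" for SME review.
pooled = sum(1 for _, o in events if o == "survived") / len(events)
print({k: round(r, 2) for k, r in model.items() if abs(r - pooled) > 0.25})
```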
Evolutionary Computing
Another area of significant opportunity lies in the application of the techniques described by the Fogels in their work on evolutionary computation (Fogel, 2000). Many of the new battlefield challenges represented by the data relationships described above are far removed from the current understanding of defense strategies. They will not be observed, presumed, or described by even the most rigorous analysis of the data. Novel and asymmetric threats are continually and rapidly evolving, driven by groups whose one remaining effective weapon may be innovation and the accompanying element of surprise. In this they are aided by their distance from defense analysts in value system, goals, training, and zeitgeist.
Evolutionary computation replaces the rule-based foundation of Monte Carlo simulations with the concept of an entity that is free to roam the range of possibilities, with an appropriate feedback loop to help optimize the path to the goal. Basing their work on artificial intelligence, expert systems, and neural net training, evolutionary computer scientists further look to the biological paradigms described by Charles Darwin in his work on the evolution of species.
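As a sketch of the idea rather than a definitive implementation, the loop below lets candidate solutions roam a one-dimensional space under random mutation, with selection on a notional fitness function serving as the feedback loop; the objective and all parameters are invented for illustration.

```python
import random

def fitness(x):
    """Notional objective: payoff peaks at x = 3.0 (purely illustrative)."""
    return -(x - 3.0) ** 2

# Entities roam the solution space freely; the feedback loop (selection on
# fitness) steers the population toward the goal -- no fixed rule set.
population = [random.uniform(-10.0, 10.0) for _ in range(20)]
for _ in range(100):
    # Each parent spawns one offspring by Gaussian mutation.
    offspring = [p + random.gauss(0.0, 1.0) for p in population]
    # Feedback: retain the fittest half of parents plus offspring.
    population = sorted(population + offspring, key=fitness, reverse=True)[:20]

print(f"best candidate: {max(population, key=fitness):.3f}")  # converges near 3.0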
Genetic Algorithms
In a variant of the Fogels' work, David Goldberg reports significant success in applying more strictly biological rules to his analysis (Goldberg, 2002). He sees the genetic evolutionary driver as having been tested over the millennia and therefore unlikely to be deficient. His application of genetic rules has been similarly successful in the test phases of his work, and he holds that the insights gained are more likely to accord with behaviors observed in actual life. Dr. Goldberg has used his techniques to model both organizational entities, such as small populations, and physical phenomena, such as gas pipelines.
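A minimal genetic-algorithm sketch in that spirit follows, using the classic single-point crossover and bit-flip mutation operators on a toy "OneMax" problem (maximize the number of set bits); nothing here is drawn from Goldberg's actual models.

```python
import random

GENOME_LEN = 16  # toy "OneMax" problem: maximize the number of 1-bits

def fitness(genome):
    return sum(genome)

def crossover(a, b):
    """Single-point crossover: the recombination operator drawn from biology."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]
for _ in range(60):
    # Fitness-ranked selection: the upper half become parents.
    population.sort(key=fitness, reverse=True)
    parents = population[:15]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(15)]
    population = parents + children

print(f"best fitness: {fitness(max(population, key=fitness))}/{GENOME_LEN}")
```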
Monte-Carlo Analyses
Many of the simulations in use by the services today rely heavily upon Monte Carlo techniques. These simulations have a pre-established rule set and a distribution or likelihood for each major activity, as described above. As noted earlier, these simulations are not deterministic, and the same basic simulation initiation program is often executed several times (hundreds of runs are not uncommon) to examine the distribution of the final outcomes (Horne, 1999). This work is often analyzed by plotting a series of two-dimensional solution spaces on a three-dimensional graph, visually identifying the optima and their relation to one another for each pair, and then estimating the interrelation of the group.
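The sketch below illustrates the practice: a toy stochastic attrition duel, invented solely for this example, is executed hundreds of times from identical initial conditions, and the distribution of outcomes is summarized.

```python
import random
from statistics import mean, stdev

def engagement(p_hit=0.3, blue=10, red=12):
    """One stochastic run of a toy attrition duel. Rules and probabilities
    are invented for illustration, not drawn from any fielded model."""
    while blue > 0 and red > 0:
        red -= sum(random.random() < p_hit for _ in range(blue))
        blue -= sum(random.random() < p_hit for _ in range(max(red, 0)))
    return max(blue, 0)  # surviving blue entities

# Identical initial conditions, executed hundreds of times: the object of
# study is the distribution of outcomes, not any single result.
runs = [engagement() for _ in range(500)]
print(f"mean survivors: {mean(runs):.2f}  stdev: {stdev(runs):.2f}  "
      f"blue wins: {sum(r > 0 for r in runs) / len(runs):.0%}")
```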
Based on the work of a physicist at Caltech, the OTCI organization has developed a tool that can quantify the degree to which the input parameters affect the final outcome (Gottschalk, 2004). This can be done in n dimensions, which would improve on the visual analytical procedure outlined above. Further, this procedure yields very interesting results with fewer runs, sometimes orders of magnitude fewer (Johnson, 1999). The technology is currently implemented for financial analyses, but could be "ported" to battlefield simulation analyses with a high expectation of efficacy and a reasonable hope for beneficial analytical products.
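The sketch below, which reuses the toy engagement() function from the previous example, suggests the flavor of such parameter sensitivity quantification; Pearson correlation between each sampled input and the outcome is our simple stand-in, not the OTCI tool's actual method.

```python
import random
from statistics import correlation  # Python 3.10+

# Sample the inputs at random, run the toy model, and score each
# parameter's influence on the outcome across the whole sample.
samples = []
for _ in range(300):
    params = {"p_hit": random.uniform(0.1, 0.5),
              "blue": random.randint(5, 15),
              "red": random.randint(5, 15)}
    samples.append((params, engagement(**params)))

outcomes = [result for _, result in samples]
for name in ("p_hit", "blue", "red"):
    values = [p[name] for p, _ in samples]
    print(f"{name}: r = {correlation(values, outcomes):+.2f}")
```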
Metrics for Evaluation of Success
No rational scientific process is viable without a metric for the validity of the information being produced. While military battlefield simulations are much closer to the behavioral sciences in their conduct and analysis, rigor is in no way diminished, nor is the need for quantification reduced. The traditional method of SME evaluation is well supported by the community (Pratt, 2004).
Additionally, the common reliance on SME reviews of simulations, while effective and useful, may be missing valuable insights that would lead to new strategic advances and expose significant vulnerabilities. Not having faced the unknown enemy of the future, not knowing its mind-set, and not having the luxury of learning at a leisurely pace, the simulation community would be well advised to take advantage of the expanded capabilities presented in the section on advanced data analysis techniques above.
Orderly retrieval of the information using the latest database techniques will assist the human analysts in pursuing their intuitive directions. The more innovative techniques of data mining can be invoked to extract even more esoteric concepts and bring these to the attention of the analysts for confirmation and analysis. This offers real hope for identifying asymmetric tactics that might not be foretold by traditional military analysis. The concepts of evolutionary computation, genetic algorithms, and Monte-Carlo sensitivity analyses also show promise in ensuring that nothing is missed in the search for defense ascendancy.
Verification and validation are already hallmarks of effective military simulation. Large grids that compare observations of simulated activities against the programmed ground truth provide a rigorous way of assessing how well the programmers produced the desired effect. More sophisticated techniques may be necessary to validate the outcomes derived from story extraction and analysis.
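As a rough illustration of the grid idea, the sketch below tallies agreement between logged observations and programmed ground truth per terrain cell and flags cells falling below a threshold; the grid size, threshold, and event triples are all hypothetical.

```python
# Minimal grid-based validation: tally how often logged observations
# agree with programmed ground truth in each terrain cell.
GRID = 8          # cells per axis (illustrative)
EXTENT = 100.0    # notional terrain extent in each dimension

def cell(x, y):
    return (int(x / EXTENT * GRID), int(y / EXTENT * GRID))

# Hypothetical (position, observed, truth) triples from an archived run.
events = [((12.0, 34.0), "detected", "detected"),
          ((12.5, 33.0), "missed", "detected"),
          ((71.0, 88.0), "detected", "detected")]

agree, total = {}, {}
for (x, y), observed, truth in events:
    c = cell(x, y)
    total[c] = total.get(c, 0) + 1
    agree[c] = agree.get(c, 0) + (observed == truth)

for c in sorted(total):
    rate = agree[c] / total[c]
    flag = "" if rate >= 0.9 else "  <- review"
    print(f"cell {c}: {rate:.0%} agreement{flag}")
```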