6. Relational model-base enabled real-time operations

If the increased reliance on decision models and computer-based decision agents is one of the key attributes of the contemporary administrative arena, another is the burgeoning interest in moving away from traditional planning-based (anticipatory) managerial postures towards more real-time (adaptive-reactive) regimes. This movement owes much to the realization that it's improvident to opt for a probabilistic approach to a decision situation for which a non-inferior deterministic solution is available, and to the companion realization that transitions towards real-time open up opportunities to transform previously probabilistic decisions into deterministic ones.

More particularly, what real-time transitions offer are opportunities for administrative decision-makers to ease their dependency on projections or expectations in favor of more definite data of more recent vintage. To exploit these opportunities, administrative authorities can turn to an ever-expanding array of real-time data-capturing devices. These devices, along with regular improvements in digital data communications capabilities, have shortened the intervals between the first emergence of informational items and the point at which they become available as decision predicates. What have also emerged are new-age conceptual referents, like the "dashboard" model of business administration (Eckerson, 2005), that suggest how managers should put accelerated predicates to proper use. So it is, also, that the main message of one of the most influential of modern management treatises, The Virtual Corporation (Davidow and Malone, 1993), is how manufacturers might make their production decisions (and retailers their restocking decisions) less reliant on demand forecasts and more responsive to revealed demand, per purchases recorded by point-of-sale terminals.

Decision choices that wind up owing less to expectations and more to actualities have a correspondingly better chance of proving favorable in terms of whatever criterion applies. This is the raison d'être for essays in the direction of dynamic resource allocation. Hence the modern military's embrace of remote sensor platforms and other facilities that can deliver real-time reconnaissance data, which allows force disposition and other combat management decisions to be based on authenticated rather than merely anticipated battlefield conditions. Applications in the public sector are also not uncommon, one of the most impressive being the United States Forest Service's elicitation of real-time fire location data from the MODIS Rapid Response system in an effort to make the best possible use of high-impact, low-availability assets like smoke jumpers and airplanes (Kaufman et al., 1998; Giglio et al., 2003).

In any case, the focus for the remainder of this section is on how relational model-base structures can support attempts at adaptive-reactive management, and so enable more extensive essays in dynamic resource allocation.

6.1 Centripetal Data Acquisition

What’s proposed by way of a real-time information handling system is a centripetal scheme like that illustrated in Figure 1.

Figure 1: A real-time information handling protocol

Centripetal processing protocols acquire contributions from a multiplicity of monitoring/reporting sources and channel them into a single data stream directed at a single relational model-base substructure. A relational model-base substructure may be the province of an Administrative Agent rather than merely a disembodied mathematical function (Davidsson et al., 2003). It's also possible, probable even, that the decision function/agent to which a relational model-base substructure pertains will be incorporated in a manifold network model, and so be counted as an aspect of an administrative task.

An important implicit provision of the protocol shown in Figure 1 is that any data items that are received will have their significance assessed more or less immediately, and can thereafter be discarded. This largely obviates the need for archival databases, much less the huge data warehouses that serve as the spinal substance for the management systems now being produced under the ERP banner. For enterprises operating under a real-time regime, historical data is of substantive significance only in so far as it's necessary to satisfy externally-imposed reporting or internal auditing requirements. Thus, enormously popular though they are these days among both administrators and academics, data mining devices would be seen as having no practical role to play in the context of real-time management systems.[1] Retrospective pattern-recognition operations would simply be deemed unnecessary, as whatever patterns might be practically pertinent to a decision situation would presumably already have been routinely recognized in the normal course of coefficient estimation exercises. This view of the fate of real-time information is certainly not unprecedented; it's not really all that different from the approach incorporated in the Complex Event Processing protocols that power event-driven information systems (Luckham, 2002).
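To make the assess-then-discard provision concrete, the Python sketch below shows one way a centripetal handler might fold each incoming item into the coefficient it bears on and then simply let the raw item go, in the spirit of the Complex Event Processing protocols just mentioned. All names, data shapes and the smoothing-style update rule are assumptions made for the illustration, not a prescription.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    factor: str       # which decision factor (state-variable) the item reports on
    value: float      # the reported reading
    timestamp: float  # the item's emergence time

class RealTimeHandler:
    """Assess-then-discard processing; no archival store is kept."""

    def __init__(self, coefficients: dict[str, float], learning_rate: float = 0.1):
        self.coefficients = coefficients    # cells of the relational model-base substructure
        self.learning_rate = learning_rate  # illustrative smoothing weight

    def ingest(self, obs: Observation) -> None:
        # Significance assessment: fold the item into the coefficient it bears on...
        old = self.coefficients.get(obs.factor, 0.0)
        self.coefficients[obs.factor] = old + self.learning_rate * (obs.value - old)
        # ...then let the raw item fall out of scope (no warehouse, no archive).
```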

As for information acquisition requirements, these would expectedly first be established with respect to the factors (determinants, state-variables) over which a decision model is defined. Thereafter, items may be assigned different acquisition priorities depending on their apparent importance to the quality of decision choices, or on what remains to be achieved in terms of increases in levels of precision or probity for a particular component (parametric or coefficient entry). The informational requirements specific to a relational substructure/decision model might then be denoted as {I}dx < t, which sets a maximally-acceptable time interval (t) between the emergence of potentially interesting informational items and their presentation as decision inputs. This phrasing is consistent with the concept of effective vs. strict real-time. Real-time, in its strict interpretation, requires that information always (unconditionally) be delivered as immediately as possible. Effective real-time, on the other hand, treats timeliness as relativistic, requiring only that information be delivered prior to the point where it's required to inform a decision.
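Purely as an illustration, the distinction can be written down as two small timeliness tests, where t stands for the interval bound in {I}dx < t and the decision time marks the point at which the item must be on hand; the function names and arguments are assumptions made for the sketch.

```python
def strictly_timely(emergence_time: float, available_time: float, t: float) -> bool:
    # Strict reading: the emergence-to-availability interval must not exceed t.
    return (available_time - emergence_time) <= t

def effectively_timely(available_time: float, decision_time: float) -> bool:
    # Effective reading: the item need only arrive before the decision that uses it.
    return available_time <= decision_time
```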

Taking steps to reduce the input burdens on real-time systems will make it easier to meet whatever timeliness requirements are in place. One way to ease data handling demands is to increase levels of information compression. This will occur as a natural consequence of any exchange of conventional relational database structures for relational model-base structures. Consider, for example, that the informational value of a coefficient sitting in a cell of a relational model-base substructure would be no less than that of the multitude of raw data items from which it was derived. The happy consequence of higher levels of information compression is a corresponding reduction in the volume of inputs needed to drive decision functions under relational model-base vs. database management conventions. Loadings on communication facilities used to support intra- or inter-organizational information exchange would also be diminished to the extent that exchanges consist of compressed parameter or coefficient passing rather than transfers of raw (unconditioned, undigested) data items.

A second tactic for moderating real-time data capturing and communication demands is Redundancy Filtering. Equipping real-time systems with redundancy filtering facilities is done in response to the assertion that information is of practicable value only to the extent that it's indicative of a change of some sort (Dawkins, 1998). Redundancy filtering is thus aimed at the systematic recognition and elimination of all items that are coincidental with expectations (e.g., in a true real-time Air Traffic Control System, the only aircraft of which human flight controllers would be aware are those that have departed in some way from previously-filed flight plans).
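A redundancy filter of the sort just described might, under assumed names and an assumed tolerance, look something like the following: readings that coincide with expectation (within the tolerance band) are dropped outright, and only genuine departures are forwarded, already in the compressed form of a revised expectation rather than a raw item.

```python
from typing import Optional

class RedundancyFilter:
    """Forward only items indicative of a change; drop whatever merely
    coincides with expectations (tolerance and update rule are illustrative)."""

    def __init__(self, expectations: dict[str, float], tolerance: float = 0.05):
        self.expectations = expectations   # current expected value per factor
        self.tolerance = tolerance         # relative band treated as "no change"

    def filter(self, factor: str, reading: float) -> Optional[float]:
        expected = self.expectations.get(factor)
        if expected is not None and abs(reading - expected) <= self.tolerance * abs(expected):
            return None                      # coincides with expectation: discard
        self.expectations[factor] = reading  # a genuine change: revise and forward
        return reading
```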

Another feature required in real-time contexts is some sort of Input Fusion facility. Such a facility would be needed whenever decision predicates might possibly be of different orders and/or origins. Trying to integrate information of different orders, such as quantitative and categorical, apodictic and anecdotal, logical and axiological, textual and visual, Elint (electronic intelligence) and Humint (human-originated intelligence), is a daunting challenge, both conceptually and mechanically. It's presumed, however, that this challenge will be kept reasonably well contained in the context of relational model-base structures because of the previously noted restriction to regular decision situations. This means that the only inputs that need to be accommodated will all be neatly numerical, such that input fusion may involve nothing more than elementary arithmetic conditioning initiatives like standardization, normalization, compilation (the development of simple statistical digests like means and modes), or minor mathematical transformations such as those performed in GPS (Global Positioning System) based navigational systems or the two- to three-dimensional conversions that produce topographical charts from aerial photographs.
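Assuming, per the regular-decision-situation restriction, that all inputs are numeric, the elementary conditioning steps mentioned above reduce to small routines of the following sort; this is a sketch of the arithmetic, not a fusion engine.

```python
import statistics

def standardize(readings: list[float]) -> list[float]:
    # Rescale to zero mean and unit spread (degenerate case returned unchanged).
    mu, sigma = statistics.mean(readings), statistics.pstdev(readings)
    return [(r - mu) / sigma for r in readings] if sigma else list(readings)

def normalize(readings: list[float]) -> list[float]:
    # Rescale to the 0..1 interval.
    lo, hi = min(readings), max(readings)
    return [(r - lo) / (hi - lo) for r in readings] if hi > lo else [0.0] * len(readings)

def digest(readings: list[float]) -> dict[str, float]:
    # A simple statistical digest (mean, mode, spread) used as a compressed predicate.
    return {"mean": statistics.mean(readings),
            "mode": statistics.mode(readings),
            "spread": statistics.pstdev(readings)}
```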

Inputs of different origins mean something more than merely inputs from multiple sources. They mean information collected on different dimensions, and so of disparate provenance rather than just disparate formatting. Contemporary climate control systems now routinely employ a simple fusion function to generate a composite variable, humiture (an amalgamation of temperature and humidity), which is then used as a key determinant of heating and cooling decisions. Somewhat more impressive are the fusion facilities that sit at the front-end of modern surveillance systems that allow reconnaissance to be carried out simultaneously on several different dimensions: visual, infrared, auditory, electrical (spectral), etc. Because the sensors in such platforms deliver data of different origins, the input fusion task is to combine the several output sets into a coherent multidimensional portrait of the object of interest (Hall and Llinas, 2001). In some cases it's the mere mobility of sensors, not their variety, that raises the requirement for a data fusion facility. The task in these cases is to converge on a composite real-time construct by collating observations taken by a dispersed array of similarly-configured sensors operating at different scanning-angles and/or distances. In these cases, readings taken from different perspectives are treated as data of different origins, with the data fusion function usually falling to Multiresolution Integration Algorithms like those designed to conjoin inputs from migratory agents embedded in distributed sensor networks (Qi, Iyengar and Chakrabarty, 2001).
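In the single-dimension, many-perspective case, the collation step can be caricatured as a confidence-weighted averaging of readings, with each sensor's weight standing in for its scanning angle or distance. The sketch below is a deliberately simplified stand-in for the multiresolution integration algorithms cited above, and its names and numbers are assumptions made for the example.

```python
def fuse_readings(observations: list[tuple[float, float]]) -> float:
    """observations: (reading, confidence) pairs, confidence > 0.
    Returns the confidence-weighted mean as the composite real-time estimate."""
    total = sum(conf for _, conf in observations)
    return sum(reading * conf for reading, conf in observations) / total

# e.g., three similarly-configured sensors viewing the same object from different
# distances, with the nearer (higher-confidence) readings weighted more heavily:
estimate = fuse_readings([(10.2, 0.9), (9.7, 0.6), (11.1, 0.3)])
```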

6.2 GIS Constructs and Templating Tactics

Administrative applications requiring real-time (dynamic) resource allocation run the gamut from prosaic activities like just-in-time inventory replenishment to more dramatic functions like disaster relief or the fielding of rapid deployment forces in the hope of interdicting looming civil crises. If relational model-base structures are to properly support such applications, they should be set up to operate on GIS (Geographic Information Systems) type constructs. These are becoming ever more widely recognized as a prime medium for portraying phenomena that are multifaceted and protean, and that have a geospatial or textural-topological aspect to them (Arctur and Zeiler, 2004; Kropla, 2005). And indeed, in addition to their traditional command of cartographic, demographic and ecological applications, GIS constructs now stand as primary sources of decision predicates in a variety of areas: civil engineering, urban planning, transportation system architecture, epidemiology, military targeting, disaster management (e.g., evacuation routing, containment of oil spills), and location analysis (Revelle and Eiselt, 2005).

Of most pertinence to relational model-base aided adaptive-dynamic management are GIS configured as composite mapping constructs (Malczewski, 1999). Composite mapping constructs are intended to show how various properties of interest are distributed within some bounded domain (model space), and also how their distributions might change in response to natural developments or introduced initiatives. Towards this end, each of the properties of a composite mapping construct would correspond to one of the facets of the subject phenomenon. Each of these properties is then assigned to a separable layer, where it can be depicted in terms of a density distribution. For suitably-configured GIS constructs, input integration (data fusion) thus consists of conjunctions established among different sets of density distribution data (or, less abstractly, the interleaving, infusion or superimposition of property-specific expository layers). As a further, and technically appealing, possibility, composite GIS might be constructed so that each of their separable layers corresponds to a factor (determinant, state-variable) over which some decision model is defined. These layers qua decision dimensions would then become the targets of real-time information acquisition efforts.
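One way to picture such a composite mapping construct is as a stack of named density grids defined over the same model space, with superimposition carried out cell by cell over whichever layers a decision model names as its factors. The structure below is a sketch under those assumptions, not a GIS implementation; the class, its methods and the weighting scheme are all illustrative.

```python
class CompositeMap:
    """A stack of named density layers defined over one bounded model space."""

    def __init__(self, rows: int, cols: int):
        self.rows, self.cols = rows, cols
        self.layers: dict[str, list[list[float]]] = {}

    def add_layer(self, name: str, grid: list[list[float]]) -> None:
        # One layer per property/factor; grids share the map's dimensions.
        self.layers[name] = grid

    def superimpose(self, weights: dict[str, float]) -> list[list[float]]:
        # Conjoin the selected layers, cell by cell, into one composite density surface.
        out = [[0.0] * self.cols for _ in range(self.rows)]
        for name, weight in weights.items():
            grid = self.layers[name]
            for i in range(self.rows):
                for j in range(self.cols):
                    out[i][j] += weight * grid[i][j]
        return out
```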

This may, at its most menial, merely require GIS constructs to be reconfigurable, so that density distributions can be altered and properties added, deleted or repositioned relative to one another. So, for example, a GIS construct covering a wetland ecosystem could be tuned to use real-time survey data to chart changes in the distribution of an index-species, along with any alterations in ancillary factors like water chemistry, the comings and goings of water birds, and the frequency and intensity of human intrusions. Thus, as is increasingly recognized by organizations or agencies with ecological interests or missions, emendable GIS constructs can provide a regularly refreshed complex of predicates to feed multicriteria-multiobjective conservancy management models, or to inform local land-use decisions involving sets of at least partially competitive stakeholders (Evans, VanWey and Moran, 2005).

GIS constructs can also host a special class of real-time referents that function as templates. As an adjunct to adaptive-reactive management support systems, a template is an only partially-elaborated model that's intended to reduce the volume of decision-related analytical requirements that need to be met in real-time. It does so by answering, in advance and in as much detail as possible, as many of the requirements associated with a decision as can be anticipated. Templates thus play a role in adaptive-reactive management systems similar to that played by contingencies in planning-based schemes. However, as appendages to GIS models, templates are formulated as graphic constructs designed to be superimposed, singly or in company with other templates, on a background map. Templates are perhaps of most obvious pragmatic import for organizations having emergency or crisis management missions, where they promise both to increase responsiveness and to preserve the rationality of decision commitments.

As an illustration of the utility of templates, consider a situation where there has been a serious accidental discharge of toxic chemicals at an exurban industrial facility. Suppose also that an array of templates had been developed describing the generic diffusion characteristics of each of the different chemical compounds the plant produces, as derived from plume projections developed using modern Atmospheric Dispersion Modeling methods (Barratt, 2001). That is, each template is a graphic portrayal of the expected behavior of a particular chemical compound given those properties that constitute constants (volatility, decomposability, particulate predispositions, etc.). As they stand, these templates are incomplete. Missing are situation-specific details such as the exact composition, location and magnitude of a release and, of course, details on contextual variables like wind velocity and humidity. Thus, given a suitable set of templates, what mainly remains to be done to get a real-time portrayal of an actual release event is to interpose particulars about actual/emergent situational conditions as they become apparent. For plume projection problems, then, templates mean that the deep structural substance of diffusion models will already be in place, so that the development of actionable managerial scripts for evacuation, containment and damage control demands only the introduction of superficialities.
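Purely to fix ideas, such a template might be caricatured as a simplified Gaussian-style dispersion function whose chemical-specific spread coefficients are fitted in advance, leaving only the situational particulars (release rate, wind speed, receptor location) to be interposed at event time. The coefficients, exponents and compound values below are placeholders invented for the sketch, not a substitute for proper Atmospheric Dispersion Modeling.

```python
import math
from dataclasses import dataclass

@dataclass
class PlumeTemplate:
    compound: str
    sigma_y_coeff: float   # pre-fitted lateral spread coefficient (placeholder value)
    sigma_z_coeff: float   # pre-fitted vertical spread coefficient (placeholder value)

    def concentration(self, q: float, u: float, x: float, y: float) -> float:
        """Ground-level, Gaussian-style estimate: q = release rate, u = wind speed,
        x = downwind distance, y = crosswind offset (all in consistent units)."""
        sigma_y = self.sigma_y_coeff * x ** 0.9    # placeholder spread growth laws
        sigma_z = self.sigma_z_coeff * x ** 0.85
        return (q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-y ** 2 / (2 * sigma_y ** 2))

# At event time only the situational particulars are interposed:
chlorine = PlumeTemplate("chlorine", sigma_y_coeff=0.22, sigma_z_coeff=0.20)
estimate = chlorine.concentration(q=500.0, u=3.0, x=1000.0, y=50.0)
```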