WiFi Test Bed Phase 2
Experimentation Report
Version 1.0
4 August 2004
Prepared under Subcontract SC03-089-252 with L-3 ComCept, Contract Data Requirements List (CDRL) item A002, WiFi Experimentation Report
Prepared by:
Timothy X. Brown, University of Colorado at Boulder
303-492-1630
1 INTRODUCTION
1.1 Purpose
This document provides the results of experimentation on a WiFi-based (802.11b) Wireless Local Area Network (WLAN) test bed made up of terrestrial and airborne nodes. This WiFi Test Bed Experimentation Report is the result of “Phase 2” activities associated with experimentation planning, deployment, and initial testing. The initial testing comprises a subset of the experiments that will be completed in “Phase 3.” This document describes the results of this initial testing and their implications for further testing.
An overview of the test bed development and deployment is given in Section 2. The measurement approach is discussed in Section 3. The results of experimentation are described in Section 4, and developments necessary for final testing are listed in Section 5.
1.2 Background
Communication networks between and through aerial vehicles are a mainstay of current battlefield communications. Present systems use specialized high-cost radios in designated military radio bands. Current aerial vehicles are also high-cost manned or unmanned systems.
L-3 ComCept Inc. has contracted with the Air Force Materiel Command (AFMC), Aeronautical Systems Center (ASC), Special Projects (ASC/RAB) to establish and manage a Wireless Communications Test Bed project for the purpose of assessing a WLAN made up of terrestrial and airborne nodes operating with WiFi-based (802.11b) communications. The University of Colorado has been subcontracted to design, install, and operate the test bed made up of Commercial Off-The-Shelf (COTS) equipment, and to integrate and operate Unmanned Aerial Vehicles (UAVs) which will interact with it. The network shall support rapidly deployed mobile troops that may be isolated from each other, allow for ad hoc connectivity, and require broadband connection to a Network Operations Center (NOC). Experiments are to be performed to measure and report on the performance and effectiveness of the test bed communications capabilities.
The Wireless Communications Test Bed project is being executed in phases. The objectives and dates associated with each phase are outlined in Figure 1.
1.3 Objective
The objective of the wireless communications test bed effort is to deploy and test a COTS-based communications network made up of terrestrial and aerial nodes that employ state-of-the-art mobile wireless and Internet Protocol (IP) technology. Figure 2 shows the two main reference scenarios. The solution in the first scenario shall support rapid deployment of mobile troops that may be isolated from each other and require broadband connectivity to a Network Operations Center. The solution in the second scenario supports UAV-to-UAV communication for extended UAV operational range and capabilities. Experiments are to demonstrate the potential for rapid deployment of an IP-centric, wireless broadband network that will support both airborne and terrestrial military operations anywhere, anytime.
Figure 1: Project Phases
Figure 2: Test Scenarios
1.4 Approach
This effort is developing the communication technology and a test bed that enables its performance to be measured. The communication technology uses standard 802.11b-based transceivers to provide radio connections. A common 802.11 platform has been designed and procured that will be utilized for all ground-based and UAV-based nodes. Special software (routing protocols) developed by the University of Colorado to efficiently manage ad hoc mobile mesh network functionality has been applied to the ad hoc network nodes. The terrestrial and airborne communication devices will form an IP-centric network on an ad hoc basis. Broadband links will be established to a remote NOC location.
The test bed includes remote monitoring capabilities embedded in each node and experimental procedures for exercising the communication capabilities under different scenarios. Remote monitoring allows remote users to access the data obtained and to monitor the test site and activity in real time. Packet data traffic in various regimes will be used to measure performance and service support abilities. Typical multimedia applications (messaging, web page download, video, and VoIP) will be evaluated.
A location has been chosen that allows for uninterrupted testing of multiple deployment scenarios. Baseline performance will be established on a ground-to-ground connected configuration. Mobile node impacts will be tested. The impact of introducing a UAV into the theater will be characterized. UAV effectiveness for connecting isolated troops will be evaluated, along with UAV abilities to extend the range of communication.
2 Test Bed Development & Deployment Overview
Phase 2 activities consisted of developing radios and UAVs, a network monitoring architecture, a test bed site, and testing procedures. In addition, deployment and checkout activities were accomplished in Phase 2. Much of this is detailed in other documents listed in the Appendix. This section highlights the main developments under this project and summarizes deployment experiences. Checkout results are addressed in Section 4, Experimental Results.
The radio consists of a COTS single board computer with a PCMCIA 802.11b interface. This is either packaged inside a custom environmental enclosure for outdoor mounting or mounted in the UAV. Power is provided via Power over Ethernet or directly from 12 V batteries. In addition, for monitoring purposes a GPS receiver is connected to the computer’s serial port. Eleven radios have been purchased: seven in enclosures and four for mounting in UAVs. The radio runs the Linux operating system. The ad hoc routing protocol is the dynamic source routing (DSR) protocol, implemented at CU. The protocol also runs on other Linux platforms such as laptop computers. The radio hardware combined with the routing software is denoted a mesh network radio (MNR). An MNR is shown in Figure 3.
Figure 3: Mesh Network Radio (MNR) (left). MNR mounted in environmental enclosure for vehicle or fixed ground mounting (center). MNR mounted in UAV (foreground right)
The DSR protocol is a so-called on-demand ad hoc routing protocol: it finds and maintains routes to a destination only as long as it has packets to send to that destination. Control overhead is therefore proportional to network activity, and nodes remain radio-quiet when not needed. The CU implementation allows us to modify the protocol as needed for the test bed. DSR is generally robust at maintaining connectivity between nodes, but it has no specific quality of service features built in.
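The on-demand idea can be illustrated with a minimal sketch. This is not the CU implementation (which performs route discovery by flooding route requests over the radio); the class and node names here are hypothetical, and a breadth-first search stands in for the route request flood. The key point it shows is that a route is looked up from a cache and discovered only when a packet must actually be sent.

```python
# Illustrative sketch of on-demand (DSR-style) routing. Not the CU
# implementation; a BFS over a known topology stands in for the route
# request flood. Idle nodes generate no control traffic.
from collections import deque

class OnDemandRouter:
    def __init__(self, name, links):
        self.name = name          # this node's identifier
        self.links = links        # adjacency map: node -> set of neighbors
        self.route_cache = {}     # destination -> full source route

    def send(self, dest):
        """Return the source route to dest, discovering one only if needed."""
        if dest not in self.route_cache:   # on-demand: no proactive routing
            self.route_cache[dest] = self._discover(dest)
        return self.route_cache[dest]

    def _discover(self, dest):
        """Breadth-first search standing in for a DSR route request flood."""
        queue = deque([[self.name]])
        seen = {self.name}
        while queue:
            path = queue.popleft()
            if path[-1] == dest:
                return path
            for nbr in self.links.get(path[-1], ()):
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(path + [nbr])
        return None  # destination unreachable

# Example: a 6-node line topology like the Table Mountain baseline.
links = {f"N{i}": {f"N{j}" for j in (i - 1, i + 1) if 1 <= j <= 6}
         for i in range(1, 7)}
router = OnDemandRouter("N1", links)
print(router.send("N6"))  # ['N1', 'N2', 'N3', 'N4', 'N5', 'N6']
```

A second call to `send("N6")` hits the route cache and performs no discovery, which is why control overhead tracks network activity.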
The UAV is a carbon-fiber adaptation of the Telemaster design, constructed at CU and shown in Figure 4. Maximum takeoff weight is 14.5 kg, of which 4.5 kg is payload; more than sufficient for the radio hardware (less than 1 kg). The engine is a 5 HP single piston that provides a cruise speed of 100 km/h. Two planes have been constructed and flown; a third is being constructed. Current control is via standard line-of-sight RC. A waypoint-flying autopilot is being installed and tested.
A key development under this project is detailed monitoring capabilities. The architecture is shown in Figure 5. Embedded in the routing protocol is a monitoring agent that collects data on the packets sent and received by each radio node. This data is periodically sent to a gateway node, which stores the data on a server. The server allows remote access to the data via a Web-based graphical user interface (GUI). The collected data summarizes packet traffic on a route-by-route basis as seen by each node that forwards a packet. The data is sent via a reliable protocol that ensures data is collected even if a node becomes isolated for long periods of time. The GUI provides functionality for navigating and visualizing the collected data. The data collection and monitoring system, from monitoring agent to GUI, has been implemented. A screenshot of the GUI is shown in Figure 6. It is used during data collection to provide situational awareness of the nodes and the communication between them. The data collected can be replayed through the GUI, or the database can be queried directly for post analysis. Data is collected every monitoring interval, which is set at 10 seconds, and includes the node positions, the number of packets sent and received by each node on each route, and the routes used.
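The store-and-forward behavior described above can be sketched as follows. The class, method, and record-field names are assumptions for illustration, not the actual CU agent; the sketch only shows the principle that records accumulate locally while a node is isolated and are delivered together once the gateway is reachable again.

```python
# Illustrative sketch of the reliable monitoring idea: buffer records
# locally each interval, deliver to the gateway only when reachable.
# Names and record fields are assumptions, not the actual CU agent.

MONITORING_INTERVAL = 10  # seconds, per the test bed configuration

class MonitoringAgent:
    def __init__(self, node_id):
        self.node_id = node_id
        self.buffer = []  # records awaiting reliable delivery

    def collect(self, timestamp, position, route_counters):
        """Snapshot one interval: node position plus per-route packet counts."""
        self.buffer.append({
            "node": self.node_id,
            "time": timestamp,
            "position": position,            # e.g. GPS (lat, lon)
            "routes": dict(route_counters),  # route -> (sent, received)
        })

    def flush(self, gateway_reachable, server_store):
        """Deliver buffered records only when the gateway is reachable."""
        if not gateway_reachable:
            return 0          # records stay buffered; node may be isolated
        delivered = len(self.buffer)
        server_store.extend(self.buffer)
        self.buffer.clear()
        return delivered

# Usage: two intervals collected while isolated, delivered together later.
server = []
agent = MonitoringAgent("MNR3")
agent.collect(0, (40.13, -105.24), {("FS1", "MNR5"): (120, 118)})
agent.collect(MONITORING_INTERVAL, (40.13, -105.24), {("FS1", "MNR5"): (119, 117)})
assert agent.flush(gateway_reachable=False, server_store=server) == 0
assert agent.flush(gateway_reachable=True, server_store=server) == 2
```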
The test bed site is the Table Mountain National Radio Quiet Zone (NRQZ), owned by the Department of Commerce and operated by the Institute for Telecommunication Sciences (ITS), approximately 10 miles north of Boulder, Colorado. The site is 2.5 miles by 1.5 miles on a raised mesa with no permanent radio transmitters in the vicinity. A map of the site is shown in Figure 7, and an aerial photo in Figure 8. A UAV operations area has been developed on the site with an airstrip, shelter, and a trailer for equipment storage. Methods have been developed to rapidly deploy the MNRs around the test site as needed for specific experiments. Experiment operational procedures have been developed for executing and documenting experiments.
FS - Fixed Site (powered locations)
MNR - Mesh Network Radio
Figure 7: Map of Table Mountain with benchmark setup for Phase 2 testing.
The testing procedures consist of preplanning, execution, and post-processing. In preplanning, a set of tests to be performed on a given day is decided upon, and procedure sheets are filled out for the UAV, the radios, and the overall experiments. Equipment is prepared, tested, and packaged for travel to the test site. The experiments are executed over a day: the testing crew meets at CU and departs for the test range; the UAV team meets at the airfield to assemble and ready the plane; and the networking crew sets up the radios and establishes connectivity back to the monitoring server. Experiments are executed per the procedure documents, and data is collected by filling in procedure sheets, executing scripts to exercise the network, and through the monitoring software. The equipment is then torn down and transported back to CU to be prepared for the next experiment. The data collected is analyzed and processed as needed.
3 Measurement Procedure
This section outlines the performance and effectiveness measures and how they are measured in Phase 2. The 9 performance measures and 6 effectiveness measures are listed in Table 1.
Table 1: Experimental Measures
Measures of Performance:
Data Throughput
Latency (communication delay)
Jitter (delay variation)
Packet Loss, Radio
Packet Loss, Congestion
Communication Availability
Remote Connectivity
Hardware Reliability
Range

Measures of Effectiveness:
Network Self-forming
Node-failure Recovery
Mobility Impact
Ease of Deployment/Transportability
Ease of Operation
Data, Voice, Video, Web Browsing
The measurement procedures are detailed in the WiFi Test Bed Experimentation Plan. The procedures used in Phase 2 to measure these variables are summarized below.
Data throughput is measured by timing the transfer of a large file between MNRs. The transfer uses TCP, so this is a measure of the reliable throughput rate. Latency, Jitter, and Packet Loss, Radio are measured using the Ping utility, which sends one packet every second; this yields the round trip times (RTT) and the fraction of packets sent and acknowledged. Packet Loss, Congestion is not measured in Phase 2. Communication Availability is measured in two steps: MNR nodes randomly choose other MNR nodes to ping for a period of time before choosing a new node, and the availability between two nodes is defined as the fraction of ping packets which are successfully sent and acknowledged. Remote Connectivity is the availability of the gateway node. Hardware Reliability is a log of hardware problems that affect completing the testing. Range is defined as the maximum distance over which the transfer of a large file can be successfully completed; the distance is measured via GPS data collected by the monitoring system.
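The ping-derived quantities above reduce to simple ratios and statistics. The following helpers are hypothetical (not the actual analysis scripts) and just make the definitions concrete: availability as acknowledged over sent, and RTT summarized by its minimum, average, and maximum.

```python
# Hypothetical helpers making the ping-based definitions concrete;
# not the test bed's actual analysis scripts.

def availability(pings_sent, pings_acked):
    """Fraction of ping packets successfully sent and acknowledged."""
    return pings_acked / pings_sent if pings_sent else 0.0

def rtt_stats(rtts_ms):
    """Minimum / average / maximum round trip time from ping RTT samples."""
    return min(rtts_ms), sum(rtts_ms) / len(rtts_ms), max(rtts_ms)

# Example: 60 one-per-second pings over a minute, 57 acknowledged.
print(availability(60, 57))                 # 0.95
print(rtt_stats([12.0, 15.0, 48.0, 14.0]))  # (12.0, 22.25, 48.0)
```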
Network Self-forming is measured by recording when a network of nodes that should be connected does or does not form. Node-failure Recovery is not measured in Phase 2. Mobility Impact is measured by making some nodes mobile and measuring the effect on the other measures. Ease of Deployment/Transportability is a description of the personnel and hardware necessary to transport and set up the experiments. Ease of Operation is the number of personnel necessary to execute the experiments. Data, Voice, Video, and Web Browsing are subjective measures and are not measured in Phase 2.
A key element of Phase 2 was to measure performance as a function of the number of hops on a route. Mesh networks have a tendency to use any available route. This means that if the first node in a line of 6 nodes communicates with the last node, their route may not be 5 hops; the network may discover shortcuts that bypass intermediate nodes. Moving the nodes further apart makes this less likely, but it also stretches the distance between adjacent neighbors, making these links less reliable. Figure 7 shows the placement of nodes that achieved reliable 5-hop routes from FS1 to MNR1 to MNR2 and so on. Small terrain variations were used to achieve this routing; for instance, MNR3 and MNR5 are separated by a small ridge so that traffic must be routed through MNR4 to connect them. This placement of nodes is the baseline placement used in all experiments.
The specific experiments include: performance as a function of the number of hops for the baseline network; performance with two of the nodes (MNR2 and MNR3) placed on vehicles and driven on the roads as indicated in Figure 7; performance with the baseline network and an additional vehicle node that circuits Table Mountain; and performance when the network is separated into two disconnected groups (MNR3 is shut down). Every experiment comprises parallel runs with and without the UAV in the air, enabling the effectiveness of the UAV to be assessed. In addition, experiments measure the UAV-to-ground and UAV-to-UAV ranges. The relationship between these experiments is shown in Figure 9.
Figure 9: Relationship between experiments. Numbers refer to the section where they are described in the WiFi Experimental Plan.
4 Experimental Results
This section describes the experimental results for each of the measures of performance and effectiveness.
4.1 Data Throughput
The throughput for the multi-hop experiment is shown in Figure 10. The one-hop throughput is approximately 1400 kbps. The MNR in ad hoc mode uses the 2 Mbps channel rate. The throughput is measured with a TCP file transfer, so the traffic includes data packets, which contribute to the throughput, plus acknowledgements, which are part of the reliable TCP transfer protocol but do not directly contribute to throughput. The 802.11 protocol also adds overhead. Thus this rate is close to the maximum throughput that could ever be expected from the MNR. The throughput decreases with the number of hops due to sharing of the medium between nearby nodes. For instance, the two-hop throughput is half the one-hop throughput, since the middle node must divide its time between receiving from the source and sending to the destination. At 5 hops, the throughput is still over 100 kbps. The fixed-node experiment shows one-standard-deviation error bars based on three measurements set up on three different days.
Figure 10: Throughput as a function of the number of hops.
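The halving from one hop to two hops suggests a simple shared-medium model. The sketch below is illustrative only (it is not the report's analysis): on an n-hop chain, each relay must both receive and retransmit on the same channel, so an idealized upper bound on end-to-end throughput falls roughly as the one-hop rate divided by n. Real radios also suffer interference beyond one hop, so measured values fall faster than this bound.

```python
# Simplistic shared-medium model, illustrative only: ideal upper bound
# on chain throughput falls as T1 / n. Measured values fall faster
# because interference extends beyond adjacent hops.

ONE_HOP_KBPS = 1400  # measured one-hop TCP throughput from the text

def ideal_chain_throughput(hops, one_hop=ONE_HOP_KBPS):
    """Ideal upper bound on end-to-end throughput over `hops` hops."""
    return one_hop / hops

for hops in range(1, 6):
    print(f"{hops} hops: <= {ideal_chain_throughput(hops):.0f} kbps")
```

At 5 hops this bound is 280 kbps, consistent with (and above) the measured figure of just over 100 kbps.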
One issue in these Phase 2 experiments was the power settings on the UAV. It was found after the fact that the output amplifier was not consistently set at maximum power, so the UAV often did not participate in the experiments; this can be seen where the throughput does not depend on the presence of the UAV. The power issue was traced to an unforeseen firmware conflict.
Procedurally, the mobile experiments need to be refined. It was found that when MNR2 and MNR3 are made the mobile nodes, they are unlikely to be in the correct positions to connect MNR1 with MNR4, so throughput to MNR4 and MNR5 could not be measured. The throughput to the mobile nodes moving at speeds up to 40 km/h was 130 kbps. This rate reflects the dynamic routing to MNR2 and MNR3: one or the other would be closer to MNR1, causing routes to fluctuate, and at times both were far from MNR1 and only poor links were available. Despite these effects, connectivity was maintained and the rate was kept above 130 kbps.
4.2 Delay and Delay Variation
The delay shows an average increase of 10 msec per hop. This is the round-trip delay; the one-way delay would be approximately half this value. The results are only for the fixed network since, in all the other cases, the route may or may not have the nominal number of hops due to route shortening through the UAV or one of the mobile nodes. The main observation is that the delay per hop is small, in the tens of milliseconds. This should not affect elastic applications such as email or web browsing. Delay is more important for real-time applications; for instance, voice applications have a target maximum RTT of 200 msec.
The graph also shows one-standard-deviation error bars based on four measurements set up on three different days. The average delay variation is small, suggesting reliable performance on average. The maximum delay is relatively high, but from the relative minimum, average, and maximum values it can be inferred that at most 15% of the packets have delays near this maximum value. Thus, most packets have small delays near the average value.
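A back-of-the-envelope check, using the per-hop RTT and voice budget figures stated above (the script itself is only an illustrative sketch), confirms the headroom available for real-time traffic on the 5-hop baseline:

```python
# Illustrative arithmetic only: per-hop RTT and the voice budget are
# taken from the text; this just tabulates the resulting headroom.

RTT_PER_HOP_MSEC = 10        # measured average round-trip increase per hop
VOICE_RTT_BUDGET_MSEC = 200  # target maximum RTT for voice applications

for hops in range(1, 6):
    rtt = hops * RTT_PER_HOP_MSEC
    one_way = rtt / 2
    print(f"{hops} hops: RTT ~{rtt} msec, one-way ~{one_way:.0f} msec, "
          f"within voice budget: {rtt < VOICE_RTT_BUDGET_MSEC}")
```

Even at 5 hops the round-trip delay (~50 msec) is a quarter of the 200 msec voice budget.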