Appendix. Supplementary data

This appendix describes the methods used to evaluate the performance of our Non-Reference Method (NRM) DustTrak DRX monitors in our study area (Figure 1) and our approach to data adjustment. The evaluation is based on a co-located air monitoring campaign that collected data for an inter-instrument comparison, assessing how our DRX data compare with a Tapered Element Oscillating Microbalance (TEOM) when monitoring the same atmosphere. The resulting data are used to assess performance and form the basis of an inter-instrument calibration model. This evaluation was not conducted to assess the suitability of the DRXs to serve as either federal reference method (FRM) or federal equivalent method (FEM) instruments. Our approach was established with guidance from the following US Environmental Protection Agency documentation:

  • USEPA Office of Air Quality Planning and Standards Technical Note - PM2.5 Continuous Monitor Comparability Assessment
  • EPA's Air Sensor Guidebook

This appendix is organized into the following sections: co-located monitoring campaign and inter-instrument calibration model.

Co-located Monitoring Campaign

Monitoring Site

Comparison data were collected at AQS Site 45-019-0049, an air monitoring station operated by the SC Department of Health and Environmental Control (DHEC) on the western side of the Charleston peninsula near downtown Charleston (Figure 1; Site F). The AQS site supports the required PM2.5 monitors for the MSA, and the sample inlets are 28 meters from the nearest road. Historically, PM2.5 concentrations at this site have been relatively low compared to other sites in South Carolina, as the location is in close proximity to the Ashley River; we therefore treat this as a ‘clean’-site comparison. For more detail on this site please see:

Data Collection

We collected 1-hr data with our five DRXs at CPW during a window from December 3, 2015 through January 12, 2016. A 1-hr interval was chosen to correspond with the resolution of the TEOM, and we performed our comparison at this time of year because weather conditions are generally mild and relative humidity is low for the region. To perform the comparison, each DRX was placed in an individual environmental enclosure and mounted so that sample inlets were 1.5-2 meters off the ground. Each DRX was deployed for a minimum of 2 weeks; however, instruments monitored for variable periods and thus may not have had overlapping windows with all other DRXs. Data were retrieved via USB download during the co-located study and stored on a secure local server at the Medical University of South Carolina (MUSC). The data were then loaded into the R Statistical Environment for processing and analysis. Note that the federal monitors at this site collect only PM2.5 data, so comparison with our other size fractions was not possible.
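The processing step described above — averaging DRX readings to 1-hr values and retaining only hours also covered by the TEOM — can be sketched as follows. This is a minimal illustration: the timestamp format, record layout, and function names are assumptions, not the actual data schema used at MUSC.

```python
from datetime import datetime
from statistics import mean

def to_hour(ts: str) -> str:
    """Truncate an ISO timestamp to the top of the hour (illustrative hour key)."""
    return datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")

def hourly_means(records):
    """Average sub-hourly PM2.5 readings into 1-hr values.
    `records` is a list of (timestamp, pm25) tuples -- an assumed layout."""
    buckets = {}
    for ts, pm25 in records:
        buckets.setdefault(to_hour(ts), []).append(pm25)
    return {hour: mean(vals) for hour, vals in buckets.items()}

def pair_with_teom(drx_hourly, teom_hourly):
    """Keep only hours monitored by both instruments (inner join on the hour key)."""
    common = sorted(set(drx_hourly) & set(teom_hourly))
    return [(h, drx_hourly[h], teom_hourly[h]) for h in common]
```

Because the DRXs monitored for variable periods, the inner join above naturally yields a different n for each instrument comparison.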

Evaluation

We evaluated the relationship between our DRXs and the FEM TEOM data using scatterplots that present the regression equation, bivariate correlation (R), root-mean-square error (RMSE), number of monitored hours (n), and average bias (DRX – TEOM) during the monitoring period (Figure 1A). A 1:1 line is drawn on each scatterplot to quickly assess whether data points fall above, below, or straddle the 1:1 line. The NRM is presented on the X-axis, while the FEM method is presented on the Y-axis.
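Under the axis convention above (NRM on X, FEM on Y), the reported statistics can be computed from the paired hourly values as in this minimal sketch; the function and variable names are illustrative, not taken from the study code.

```python
from math import sqrt
from statistics import mean

def comparability_stats(drx, teom):
    """Ordinary least-squares slope/intercept for TEOM ~ DRX, correlation R,
    RMSE of the fitted line, monitored hours n, and average bias (DRX - TEOM)."""
    n = len(drx)
    mx, my = mean(drx), mean(teom)
    sxx = sum((x - mx) ** 2 for x in drx)
    sxy = sum((x - mx) * (y - my) for x, y in zip(drx, teom))
    syy = sum((y - my) ** 2 for y in teom)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / sqrt(sxx * syy)
    rmse = sqrt(mean([(y - (intercept + slope * x)) ** 2
                      for x, y in zip(drx, teom)]))
    bias = mean([x - y for x, y in zip(drx, teom)])
    return {"slope": slope, "intercept": intercept, "R": r,
            "RMSE": rmse, "n": n, "bias": bias}
```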

Figure 1A: PM2.5 comparability assessment for NRM DRXs using data collected during co-located monitoring at AQS Site 45-019-0049.

Linear regressions reveal a range of positive slopes (β1) estimating that, for every 1 µg/m3 increase in the NRM data, the FEM data increased by 2.34-4.13 µg/m3 (e.g., for Site B, β1 = 3.57). Modest variability was also observed in the intercepts (β0), reflecting different underlying means for the TEOM data used in the models. Bivariate correlations (R) show that our monitors performed reasonably well in capturing TEOM variability, with R ranging from 0.65 to 0.79. Root-mean-square errors (RMSE) show relatively similar performance across model predictions, with values ranging from 1.4 to 2.5; the monitor deployed at Site E performed best. The number of hours observed (n) varied from a minimum of 119 (5 days) to a maximum of 848 (35 days). The average bias ranged from -2.5 to 0.58. Background conditions, monitoring period, and duration differed for each instrument comparison, and mean PM2.5 concentrations during the evaluation period were between 4 and 6 µg/m3. Overall, the performance of our DRXs during this co-located campaign was deemed satisfactory; however, further co-located monitoring is planned, as comparison across multiple size fractions is desired.

Inter-Instrument Calibration Model (DRX≈TEOM)

Based on the evaluation results, it is clear that our instruments demonstrate bias when compared to TEOM data. To adjust for this bias, we developed an inter-instrument calibration model using our co-located data within the framework of a generalized additive model (GAM), yielding a nonlinear response curve and instrument-specific offsets used to adjust our DRX PM2.5 data. More specifically, this model can be written as:

Y_ijk = α + DRX_k + s(NRM_ijk) + ε_ijk

where Y_ijk is the hourly FEM PM2.5 concentration measured at time i during co-located monitoring period j with DRX instrument k; α is the intercept term; DRX_k is the linear effect (offset) of the kth DRX instrument; s(·) is the smooth function of the NRM PM2.5 measurement collected at time i during monitoring period j by monitor k; and ε_ijk represents the error term.
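As a structural illustration only, the instrument offsets plus smooth term can be mimicked with an ordinary least-squares fit on instrument dummies and a truncated-linear spline basis. The actual model was a penalized GAM (fit, per the analysis environment noted earlier, in R); the knot placement, instrument labels, and unpenalized basis below are assumptions made for the sketch.

```python
import numpy as np

def design_matrix(nrm, instrument, knots, levels):
    """Columns: intercept, instrument offsets (first level as reference),
    and a truncated-linear spline basis on the NRM readings -- a crude,
    unpenalized stand-in for the GAM smooth s(NRM)."""
    cols = [np.ones_like(nrm)]
    for lev in levels[1:]:                      # dummy-code the DRX_k offsets
        cols.append((instrument == lev).astype(float))
    cols.append(nrm)                            # linear part of the smooth
    for k in knots:                             # hinge terms supply curvature
        cols.append(np.maximum(nrm - k, 0.0))
    return np.column_stack(cols)

def fit_calibration(nrm, teom, instrument, knots):
    """Fit the offsets + smooth structure by least squares and return an
    adjustment function mapping new DRX readings to the TEOM scale."""
    nrm = np.asarray(nrm, float)
    teom = np.asarray(teom, float)
    instrument = np.asarray(instrument)
    levels = sorted(set(instrument))
    X = design_matrix(nrm, instrument, knots, levels)
    beta, *_ = np.linalg.lstsq(X, teom, rcond=None)
    def adjust(new_nrm, new_instrument):
        Xn = design_matrix(np.asarray(new_nrm, float),
                           np.asarray(new_instrument), knots, levels)
        return Xn @ beta
    return adjust
```

A penalized fit would additionally shrink the hinge coefficients to control wiggliness; the sketch omits that to stay self-contained.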

Figure 2A: Panel (a) presents the nonlinear calibration curve estimated by our GAM. Panel (b) presents the DRX specific linear effects estimated by our GAM.

Results identified a somewhat nonlinear function between our DRX measurements and TEOM measurements (Figure 2A), indicating a positive relationship between data collected by the two methods. Comparison of the raw data with our adjusted data reveals that the model compresses the range of our data and removes the overall bias (Figure 2A). Finally, the adjusted R2 was 0.56, and 56% of our response deviance was explained by our nonlinear response function.
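For a Gaussian model such as this GAM, deviance explained reduces to 1 − RSS/TSS, and the adjusted R2 applies a degrees-of-freedom correction to that same quantity, which is why the two reported values nearly coincide. A minimal sketch of both (with p standing in for the model's effective degrees of freedom, an assumption for illustration):

```python
from math import fsum

def deviance_explained(y, yhat):
    """Gaussian case: fraction of deviance explained = 1 - RSS/TSS."""
    ybar = fsum(y) / len(y)
    rss = fsum((yi - fi) ** 2 for yi, fi in zip(y, yhat))
    tss = fsum((yi - ybar) ** 2 for yi in y)
    return 1.0 - rss / tss

def adjusted_r2(y, yhat, p):
    """Adjusted R^2 for a model with p estimated parameters (incl. intercept);
    for a GAM, p would be the effective degrees of freedom."""
    n = len(y)
    r2 = deviance_explained(y, yhat)
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p)
```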
