Definitions

  1. (2 points) Define white noise and explain how it is related to the time-series modeling process.
    White noise is a completely random series of numbers with no patterns or structure: the observations are uncorrelated (no autocorrelation) and have a constant mean and variance.
    Because there is nothing systematic to capture, white noise itself can’t be modelled or forecast.
    It matters for the modeling process because the residuals of a well-fitted model should look like white noise; if they don’t, the model is leaving usable information in the data.
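
A minimal sketch in Python (NumPy assumed; the series is simulated, not taken from the exam) showing that white noise has sample autocorrelations close to zero at every lag:

```python
import numpy as np

rng = np.random.default_rng(42)
wn = rng.normal(loc=0.0, scale=1.0, size=200)   # mean 0, constant variance, no structure

# Sample autocorrelations at lags 1..10
ybar = wn.mean()
denom = np.sum((wn - ybar) ** 2)
for k in range(1, 11):
    r_k = np.sum((wn[k:] - ybar) * (wn[:-k] - ybar)) / denom
    print(f"lag {k:2d}: r = {r_k:+.3f}")   # all should fall roughly inside +/- 2/sqrt(200)
```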

  1. (2 points) Compare and contrast residuals and forecast errors.
    Training set: the data used to fit the model. Fitting generates residuals, which are what is left over after fitting the model: the differences between the actual values and the fitted values.
    Testing set: the data used to test the model. Forecasting over this period generates forecast errors, which are the differences between the observed values and their forecasts.
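
A short sketch in Python (NumPy assumed; the numbers and the naïve model are only illustrative) showing where each quantity comes from:

```python
import numpy as np

y = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119], dtype=float)
train, test = y[:8], y[8:]                 # fit on the first 8 points, hold out the last 2

# Naive model: each fitted value is the previous observation
fitted = train[:-1]
residuals = train[1:] - fitted             # residuals: actual - fitted, on the training set

# Forecasts for the test period: repeat the last training observation
forecasts = np.repeat(train[-1], test.size)
forecast_errors = test - forecasts         # forecast errors: observed - forecast, on the test set

print("residuals:      ", residuals)
print("forecast errors:", forecast_errors)
```
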
  2. (2 points) Explain how moving averages (or any smoothing technique) can be used to identify the trend-cycle component in decomposition analysis.
    The four main components of a time series are trend, cycle, season, and remainder. Usually we combine trend and cycle into a single trend-cycle component.
    By smoothing, we can average out the seasonal and remainder components, and we are left with the trend-cycle component.
    If we are using moving averages, then we need to choose the order of the moving average so that it matches the seasonal period of the data (see the sketch below). Otherwise we will filter out the remainder but will still have a seasonal pattern in the smoothed series.
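
A small sketch in Python (NumPy assumed; the monthly series is simulated for illustration): a 2×12 centred moving average gives each of the 12 months equal total weight, so a period-12 seasonal pattern averages out and the smoothed values track the trend-cycle.

```python
import numpy as np

# Simulated monthly series: linear trend + period-12 seasonality + noise
rng = np.random.default_rng(0)
t = np.arange(48)
y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)

# 2x12 MA weights: 1/24 at each end, 1/12 in between (13 weights, summing to 1)
weights = np.r_[0.5, np.ones(11), 0.5] / 12
trend_cycle = np.convolve(y, weights, mode="valid")   # estimates for the interior months

print(np.round(trend_cycle, 2))   # tracks the 0.5*t trend; the seasonal swing is gone
```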

Problem

  1. Use the following time-series graph to describe:
  2. (1 point) Trend
    We have a global, positive trend in the data.
  3. (1 point) Cyclical
    There doesn’t appear to be any cyclical component: the series rises steadily, with no multi-year rises and falls around the trend.
  4. (1 point) Seasonality

The seasonal pattern repeats every 12 months (the data are monthly). The pattern is heteroscedastic: the size of the seasonal swings grows with the level of the series.

  1. First define and explain the ACF and then use the following autocorrelation function to describe
  1. (2 points) Define and explain ACF in general.

Autocorrelation is the relationship between lagged values of the same time series: the linear relationship, measured by a correlation coefficient, between a time series and the same series lagged k periods. The autocorrelation function (ACF) plots this correlation for each lag k.
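
In symbols, the sample autocorrelation at lag k (the quantity plotted by the ACF) is

$$r_k = \frac{\sum_{t=k+1}^{T}\left(y_t - \bar{y}\right)\left(y_{t-k} - \bar{y}\right)}{\sum_{t=1}^{T}\left(y_t - \bar{y}\right)^{2}},$$

where T is the length of the series and $\bar{y}$ is its mean.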

  1. (1 point) Trend

The ACF above declines very slowly, which is the typical pattern when there is a trend in the data. Observations that are close to each other in time tend to be very similar, which results in large positive correlations at the early lags.

  1. (1 point) Cyclical
    A cycle doesn’t have a regular, known frequency. For this reason we can’t detect cyclical patterns from the ACF.
  2. (1 point) Seasonality
    The bump close to lag 12 suggests seasonality. It would be more pronounced if we differenced the data to remove the trend (see the sketch below).
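
A rough sketch in Python (NumPy assumed; the trending, seasonal series is simulated, not the exam data) of why differencing makes the seasonal signature clearer in the ACF:

```python
import numpy as np

def acf(x, k):
    """Sample autocorrelation of x at lag k."""
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    return np.sum((x[k:] - xbar) * (x[:-k] - xbar)) / np.sum((x - xbar) ** 2)

rng = np.random.default_rng(1)
t = np.arange(120)
y = 0.8 * t + 15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, t.size)   # trend + season + noise

dy = np.diff(y)   # first difference removes most of the trend
for label, series in [("original", y), ("differenced", dy)]:
    print(f"{label:12s} lag 6: {acf(series, 6):+.2f}   lag 12: {acf(series, 12):+.2f}")

# The trend keeps the original ACF large and positive at both lags; after differencing,
# the seasonal alternation (negative at lag 6, positive at lag 12) stands out.
```
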
  1. The Box-Cox transformation is given by
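
For reference, the standard form of the transformation is

$$w_t = \begin{cases} \ln(y_t) & \text{if } \lambda = 0, \\ \left(y_t^{\lambda} - 1\right)/\lambda & \text{otherwise.} \end{cases}$$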

A suggested value of lambda results in the following time-series plots:

  1. (1 point) Explain why the Box-Cox transformation was needed.
    The Box-Cox transformation was needed to make the very heteroscedastic seasonal pattern more uniform, or homoscedastic.
  2. (2 points) Explain how the Box-Cox transformation works in general.
    Box-Cox is a power transformation, and the value of lambda is essentially the exponent. When lambda is 2 we are (up to shifting and rescaling) squaring the data; when lambda equals -1 we are taking the reciprocal of the series; and when lambda equals zero we are taking the natural logarithm of the time series.
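
A quick sketch in Python (SciPy assumed; SciPy’s boxcox uses the (y^λ − 1)/λ form shown above, which is just a shifted and rescaled power, and the data below are simulated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
y = np.exp(rng.normal(5, 0.4, 100))          # positive, right-skewed data

w, lam = stats.boxcox(y)                      # lambda chosen by maximum likelihood
print("suggested lambda:", round(lam, 2))

# Special cases of lambda
print(np.allclose(stats.boxcox(y, lmbda=0),  np.log(y)))        # lambda = 0  -> natural log
print(np.allclose(stats.boxcox(y, lmbda=2),  (y**2 - 1) / 2))   # lambda = 2  -> (squared - 1)/2
print(np.allclose(stats.boxcox(y, lmbda=-1), 1 - 1 / y))        # lambda = -1 -> 1 - reciprocal
```
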
  3. For the transformed data, analyze the
  4. (1 point) Trend

Global positive trend

  1. (1 point) Cyclical
    No cycle
  2. (1 point) Seasonal
    Homoscedastic seasonality
  1. The following error summary table results from using the last two years as a testing set.
  1. (2 points) Explain the difference between training and testing sets.
    Training set: the data used to fit (estimate) the model.
    Testing set: held-out data; we use the fitted model to make forecasts over this period and then compare those forecasts with the actual observations.
  2. (1 point) Explain how forecast errors are calculated.
    Forecast errors are the difference between an actual observation and its corresponding forecast in the testing set.
  3. (2 points) Explain how you calculate the RMSE and the MAD.
    RMSE: square the forecast errors, take the average of the squares, and then take the square root.
    MAE (also called the MAD): take the average of the absolute values of the forecast errors.
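
A tiny sketch in Python (NumPy assumed; the error values are made up, not taken from the summary table):

```python
import numpy as np

errors = np.array([3.0, -5.0, 2.5, -1.0, 4.5, -2.0])   # forecast errors: actual - forecast

rmse = np.sqrt(np.mean(errors ** 2))   # square, average, then square root
mae = np.mean(np.abs(errors))          # average of the absolute errors

print(f"RMSE = {rmse:.2f}, MAE = {mae:.2f}")
```
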
  4. (1 point) Which model is best? Why?

The seasonal naïve model is best because it has the lowest RMSE and MAE.

  1. The following graph summarizes the STL decomposition of airline passengers:

  1. (4 Points) Explain the STL decomposition steps that generate each panel in the graph.

First, use loess smoothing to isolate the trend-cycle component.

Second, subtract the smoothed trend-cycle values from the data, leaving the combined seasonal and remainder variation.

Third, smooth those detrended values (loess applied across each season’s sub-series) to isolate the seasonal component.

Fourth, subtract the trend-cycle and seasonal components from the original data to get the remainder.
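
A sketch in Python using statsmodels’ STL (assuming that library; the monthly series is simulated rather than the airline data) that produces the four panels described above:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(3)
idx = pd.date_range("2015-01", periods=96, freq="MS")
t = np.arange(96)
y = pd.Series(200 + 2 * t + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 96), index=idx)

res = STL(y, period=12, robust=True).fit()    # loess-based trend and seasonal smoothing
panels = pd.DataFrame({
    "observed":  res.observed,
    "trend":     res.trend,      # trend-cycle panel
    "seasonal":  res.seasonal,   # seasonal panel
    "remainder": res.resid,      # remainder panel
})
print(panels.head())
```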

  1. Analyze the trend, seasonal, and remainder components.
  2. (1 point) Trend

Linear, global trend

  1. (1 point) Seasonal
    Seasonal pattern that is increasing in amplitude
  2. (1 point) Remainder

Remainder does not appear to be white noise because there is clustering of positive and negative residuals.

  1. Compare and contrast the seasonal pattern shown with the global seasonality that would have been estimated using classical decomposition.
    (1 point) Classical decomposition estimates a single global seasonal pattern and doesn’t allow for evolution within the seasonal component; STL lets the seasonal component change slowly over time, which is why the increase in amplitude is visible in the seasonal panel above.
  2. (2 points) Interpret the following seasonal index. Explain how you know that it comes from an additive model.

January is about 38 below the overall trend-cycle line (an index of -38).

July and August are about 113 to 115 above the trend-cycle line.

The index comes from an additive model because it centers on 0. If this were a multiplicative model, then the index would represent proportions (percentages) and would center on 1.
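
In equation form, the two decompositions the index could have come from are

$$\text{additive: } y_t = T_t + S_t + R_t, \qquad \text{multiplicative: } y_t = T_t \times S_t \times R_t.$$

Under the additive model, a January index of -38 means a January observation sits about 38 units below the trend-cycle value for that month (for a hypothetical trend-cycle value of 400, the seasonally expected level would be 400 - 38 = 362); under a multiplicative model the same idea would be expressed as a factor such as 0.90, i.e. about 10% below the trend-cycle.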