STAT 520 (Spring 2010) Lecture 1, January 12, Tuesday

  1. Examples of time series

A time series is a set of observations x_t, each one being recorded at a specified time t. We will focus on discrete-time series, for which the set T_0 of times at which observations are made is a discrete set (for example, T_0 = {1, 2, ..., n}).

Example 1. Daily data of DJII.

Closing daily values of the Dow Jones Industrial Index for 251 successive trading days, ending August 26, 1994.

Example 2. Accidental death data. (DEATHS.TSM)

Approach 1. Classical decomposition

Figure 2: The monthly accidental deaths x_t, 1973-1978, Figure 1-2, p.4.

Figure 3: Deseasonalized series x_t - s^_t, based on the estimated seasonal components, Figure 1-24, p.32.

Figure 4: Deseasonalized and detrended series x_t - m^_t - s^_t.

ITSM::(Fit)

======

Polynomial fit:

X(t) = 0.82602 * (t^2) - 71.817 * t + 9951.8

Seasonal fit of period = 12

Seasonal components:

 1)  -804.32     5)   343.42      9)  -108.77
 2) -1521.7      6)   746.41     10)   258.31
 3)  -737.47     7)  1680.0      11)  -259.38
 4)  -525.81     8)   986.84     12)   -57.461

Figure 5: The classical decomposition fit of the accidental deaths series.
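For readers without ITSM at hand, the classical decomposition can be sketched in a few lines of numpy. This is a minimal illustration of the textbook recipe (OLS polynomial trend, then seasonal components as centered monthly means of the detrended series), not ITSM's exact algorithm, whose smoothing details differ:

```python
import numpy as np

def classical_decomposition(x, period=12, trend_order=2):
    """Fit a polynomial trend by ordinary least squares, then estimate
    seasonal components as the means of the detrended values in each
    season, centered so the components sum to zero over one period."""
    x = np.asarray(x, dtype=float)
    t = np.arange(1, len(x) + 1)
    coeffs = np.polyfit(t, x, trend_order)      # OLS polynomial trend m_t
    trend = np.polyval(coeffs, t)
    detrended = x - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal -= seasonal.mean()                 # components sum to zero
    residual = detrended - np.tile(seasonal, len(x) // period + 1)[:len(x)]
    return trend, seasonal, residual
```

Applied to DEATHS.TSM, the quadratic trend coefficients would be directly comparable to the ITSM polynomial fit above, though the estimated seasonal components need not match ITSM's to the last digit.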

Approach 2. Decomposition by differencing

Figure 6: The differenced series ∇_12 x_t, Figure 1-26, p.34.

Figure 7: The twice-differenced series ∇∇_12 x_t, Figure 1-27, p.34.
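The differencing approach is easy to sketch in numpy. The series below is a synthetic placeholder with a linear trend and period-12 seasonality, not DEATHS.TSM; it shows that ∇_12 removes the seasonal component and turns a linear trend into a constant, which a further ∇ removes:

```python
import numpy as np

def difference(x, lag=1):
    """Lag-d differencing: (∇_d x)_t = x_t - x_{t-d}; drops the first d values."""
    x = np.asarray(x, dtype=float)
    return x[lag:] - x[:-lag]

# Example: x_t = a + b*t + s_t with s_t of period 12.
# difference(x, 12) is the constant 12*b; difference of that is zero.
```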

See also other examples in the book. Please read ITSM Tutorial D.1-2.

Example 3. An artificial example with seasonality, trend, and noise.

Figure 8: The time series with only seasonality

Figure 9: The time series with only trend

Figure 10: The time series with only noise

Figure 11: The time series with seasonality, trend, and noise

Since I generated this artificial data set of 80 observations, I was able to decompose it exactly. How would you arrive at the decomposition without such knowledge?
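A series of this kind can be generated as follows. The trend slope, seasonal amplitude, noise level, and seed below are my own illustrative choices, not the values used to produce Figures 8-11:

```python
import numpy as np

rng = np.random.default_rng(520)               # arbitrary seed, for reproducibility
n = 80
t = np.arange(1, n + 1)
trend = 0.5 * t                                 # hypothetical linear trend
seasonal = 10.0 * np.sin(2 * np.pi * t / 12)    # hypothetical period-12 cycle
noise = rng.normal(0.0, 2.0, n)                 # Gaussian noise
x = trend + seasonal + noise                    # the observed series
```

Plotting trend, seasonal, noise, and x separately reproduces the four views in Figures 8-11.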

  2. The objectives of time series analysis

Understanding the dynamics

For example, the accidental deaths data has a representation

X_t = m_t + s_t + Y_t,

where m_t is a trend component, s_t a seasonal component, and Y_t a random noise component.

Note: We would like {Y_t} to be stationary.

Prediction

Consider a sequence of observations X_1, ..., X_n. How does one predict the value of X_{n+h}, h >= 1, based on the information one has from X_1, ..., X_n?

  3. Some simple time series models

As we see in the artificial example, it is realistic to consider a time series as a sequence of random variables {X_t}, with the observed data being a realization of the joint distribution of this sequence. Specifying the distribution of the sequence is the task of time series modeling. We examine a few models that have zero mean, and then models with trend and/or seasonality.

Example: White noise: {Z_t} ~ WN(0, σ^2),

i.e., the Z_t are uncorrelated, with E(Z_t) = 0 and Var(Z_t) = σ^2.

Example: i.i.d. noise: {Z_t} ~ IID(0, σ^2).

Example: Gaussian white noise: {Z_t} i.i.d. N(0, σ^2).

Example: Binary i.i.d. noise, e.g., P(Z_t = 1) = P(Z_t = -1) = 1/2.

Example: Random walk: S_t = Z_1 + Z_2 + ... + Z_t for t = 1, 2, ..., with S_0 = 0, where {Z_t} is i.i.d. noise.

Differencing: ∇S_t = S_t - S_{t-1} = Z_t.
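This identity is easy to check numerically. A minimal sketch, using Gaussian draws as the i.i.d. noise:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(200)          # i.i.d. noise Z_1, ..., Z_200
s = np.cumsum(z)                      # random walk S_t = Z_1 + ... + Z_t
# Differencing recovers the noise: with S_0 = 0, ∇S_t = S_t - S_{t-1} = Z_t.
recovered = np.diff(s, prepend=0.0)
```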

  4. Models with trend and seasonality

The standard decomposition

X_t = m_t + s_t + Y_t

of a time series into a “trend component” m_t, a “seasonal component” s_t, and a “random noise component” Y_t offers the possibility of describing many different kinds of time series data appropriately.

Estimation of trend when there is no seasonality

How does one estimate the trend m_t in the model X_t = m_t + Y_t, assuming that E(Y_t) = 0?

Ordinary Least Squares Estimation. Model the trend m_t as a polynomial function of time t.

This method works extremely well for the US population data (USPOP.TSM, Fig 1.8, p.11), where a quadratic shape is quite visible. One can check that a linear fit is rather poor, while a polynomial fit of order 3 is no better than one of order 2.
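This comparison of polynomial orders can be made concrete by looking at residual sums of squares. A sketch; the test data here is a synthetic quadratic stand-in, since USPOP.TSM is not reproduced in these notes:

```python
import numpy as np

def poly_rss(x, order):
    """Residual sum of squares of an OLS polynomial fit of the given order."""
    x = np.asarray(x, dtype=float)
    t = np.arange(1, len(x) + 1)
    coeffs = np.polyfit(t, x, order)      # OLS fit of m_t as a polynomial in t
    resid = x - np.polyval(coeffs, t)
    return float(resid @ resid)
```

On data with a genuinely quadratic trend, poly_rss(x, 1) is large while poly_rss(x, 2) and poly_rss(x, 3) are both near zero, mirroring the observation above.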

For the Lake Huron level data (LAKE.TSM), a linear fit is as good as a quadratic one (Fig 1.9, p.11). The residuals exhibit a pattern of runs (Fig 1.10, p.12). In 1.4.2 we will discuss a more intelligent way to fit the Lake Huron data, via “autoregression”. This data set illustrates that the “correctness” of statistical models is relative: no one knows the exact model that generated the observed realization, but one seeks a model that “best” describes the data dynamics.

Harmonic regression is a convenient way to model seasonality. For the accidental death data (DEATHS.TSM), trying harmonic regression together with a polynomial trend of order 2 gives a much better fit than the one in the book (Fig 1.11, p.13, which uses only harmonics, with no trend).
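Harmonic regression with a polynomial trend can be sketched as one OLS problem with a combined design matrix. The number of harmonics and the trend order are the knobs one would tune; the test data here is synthetic, not DEATHS.TSM:

```python
import numpy as np

def harmonic_trend_fit(x, period=12, n_harmonics=2, trend_order=2):
    """OLS fit of x_t on 1, t, ..., t^trend_order together with
    cos(2*pi*j*t/period) and sin(2*pi*j*t/period), j = 1..n_harmonics."""
    x = np.asarray(x, dtype=float)
    t = np.arange(1, len(x) + 1, dtype=float)
    cols = [t**k for k in range(trend_order + 1)]        # polynomial trend columns
    for j in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * j / period
        cols += [np.cos(w * t), np.sin(w * t)]           # harmonic columns
    design = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(design, x, rcond=None)
    return beta, design @ beta                           # coefficients, fitted values
```

For the accidental deaths data one would call harmonic_trend_fit(x, period=12, trend_order=2) and compare the fitted values with Fig 1.11.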