Time Series Analysis

Chapter in Advanced Statistics for the Behavioral Sciences

Abstract

Many phenomena unfold over time. For example, stock prices rise and fall, diseases run their course, and relationships ebb and flow. Occurrences like these create a time series — a sequence of observations identified by the order in which they occur. Owing to properties of inertia and persistence, the observations in a time series tend to change slowly and are frequently characterized by dependencies. Consequently, previous observations provide information about present observations, and ordinary least squares is an inefficient way to model the data.

Notes

  1.

    There is an important difference between the analyses we covered in Chap. 6 and the ones we will be discussing here. Earlier, we modeled the dependencies in the errors from a linear regression model using an equation of the following form: \( e_t = e_{t-1} + u_t \). In this chapter, we model dependencies in the observations themselves. For example, \( y_t = y_{t-1} + u_t \) (among other possibilities).
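    To make the distinction concrete, a dependence in the observations themselves can be simulated in \( \mathcal{R} \) with arima.sim. This is a minimal sketch, not the chapter's code; the coefficient 0.7 is an arbitrary illustrative choice.

    ```r
    # Sketch only: simulate a series whose observations depend on their own
    # past values, here an AR(1) process y_t = 0.7 * y_{t-1} + u_t
    # (the 0.7 coefficient is an arbitrary choice for illustration).
    set.seed(1)
    y <- arima.sim(model = list(ar = 0.7), n = 200)

    # Recover the autoregressive coefficient from the simulated data;
    # the estimate should fall near the true value of 0.7.
    fit <- arima(y, order = c(1, 0, 0))
    coef(fit)["ar1"]
    ```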

  2.

    The \( \mathcal{R} \) code needed to produce the figures is provided at the end of this chapter.

  3.

    To be precise, this definition is known as weak stationarity. A strongly stationary process is one for which all moments of a distribution are constant across time. A TSA requires only that the first two moments of a distribution (i.e., its mean and variance) are constant across time, so it represents a weakly stationary process. Henceforth, the term stationary will be used to refer to a weakly stationary process.

  4.

    The distribution in our example is normal with mean zero, so the process could also be called Gaussian white noise.

  5.

    The Dickey-Fuller test can be modified to include an intercept and a linear term. In \( \mathcal{R} \), the augmented test can be found in the urca package (ur.df) or tseries package (adf.test). These packages also offer other tests of stationarity not discussed here.
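    As a sketch of how the augmented test might be called, assuming the tseries package is installed (this illustration and its simulated series are mine, not the chapter's):

    ```r
    # Sketch only: augmented Dickey-Fuller test via tseries::adf.test.
    library(tseries)

    set.seed(1)
    stationary_ar1 <- arima.sim(model = list(ar = 0.5), n = 200)  # stationary
    random_walk    <- cumsum(rnorm(200))                          # unit root

    adf.test(stationary_ar1)  # typically a small p-value: reject unit-root null
    adf.test(random_walk)     # typically a large p-value: fail to reject
    ```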

  6.

    The standard errors described in Equation 13.13 are often referred to as Bartlett standard errors.

  7.

    It is customary to use 2 rather than 1.96 when forming the 95% confidence interval of an autocorrelation, and many statistical packages, including \( \mathcal{R} \), use Equation (13.14) when plotting confidence intervals for all autocorrelations, not just the first.
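    As a quick sketch of this rule (my own illustration, not the chapter's code), the \( \pm 2/\sqrt{n} \) bound can be compared against the sample autocorrelations of a white-noise series:

    ```r
    # Sketch only: the approximate 95% bound +/- 2/sqrt(n) for sample
    # autocorrelations, applied to white noise (no true autocorrelation).
    set.seed(1)
    x <- rnorm(100)
    n <- length(x)
    r <- acf(x, plot = FALSE)$acf[-1]   # sample autocorrelations, lag 0 dropped
    bound <- 2 / sqrt(n)                # approximate 95% confidence bound
    which(abs(r) > bound)               # lags whose autocorrelation exceeds it
    ```

    For white noise, only about 5% of the sample autocorrelations should exceed the bound by chance.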

  8.

    Later we will show that Figure 13.1 plots the impulse responses for the four hypothetical couples introduced at the outset of this chapter.

  9.

    The signs used to describe MA processes differ among statistical packages. \( \mathcal{R} \) uses a positive coefficient to describe the pattern seen in the top portion of Figure 13.4 and a negative coefficient to describe the pattern seen in the middle portion of the figure, so I am following this convention.

  10.

    The acf function in \( \mathcal{R} \) plots the autocorrelation at lag 0, which is always 1. Hardly anyone (myself included) finds this informative, so I have removed it from all figures in this chapter.
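    One way to do this (a sketch of my own, not the chapter's code) is to subset the acf object before plotting:

    ```r
    # Sketch only: drop the lag-0 bar (always 1) from an ACF plot by
    # subsetting the acf object before plotting it.
    set.seed(1)
    x <- rnorm(100)
    a <- acf(x, plot = FALSE)
    a$acf <- a$acf[-1, , , drop = FALSE]  # remove lag 0 from the estimates
    a$lag <- a$lag[-1, , , drop = FALSE]  # ...and from the lag axis
    plot(a)                               # plot now starts at lag 1
    ```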

  11.

    A stationary process is sometimes referred to as a causal process.

  12.

    The rule of thumb is that a TSA must have a minimum of 50 observations in order for the estimates to be stable.

  13.

    The nomenclature for the state space representation is not standard, and different textbooks present these equations in slightly different form.

  14.

    The Kronecker product of two matrices, A and B, is a block matrix formed from multiplying B by each element of A. To illustrate:

    $$ \begin{bmatrix}1 & 2\\ 3 & 4\end{bmatrix}\otimes \begin{bmatrix}5 & 6\\ 7 & 8\end{bmatrix}=\begin{bmatrix}1\cdot \begin{bmatrix}5 & 6\\ 7 & 8\end{bmatrix} & 2\cdot \begin{bmatrix}5 & 6\\ 7 & 8\end{bmatrix}\\ 3\cdot \begin{bmatrix}5 & 6\\ 7 & 8\end{bmatrix} & 4\cdot \begin{bmatrix}5 & 6\\ 7 & 8\end{bmatrix}\end{bmatrix}=\begin{bmatrix}5 & 6 & 10 & 12\\ 7 & 8 & 14 & 16\\ 15 & 18 & 20 & 24\\ 21 & 24 & 28 & 32\end{bmatrix} $$
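    The same computation can be reproduced in \( \mathcal{R} \), which exposes the Kronecker product as the %x% operator (equivalently, the kronecker function):

    ```r
    # The Kronecker product from the example above, via R's %x% operator.
    A <- matrix(c(1, 2, 3, 4), nrow = 2, byrow = TRUE)
    B <- matrix(c(5, 6, 7, 8), nrow = 2, byrow = TRUE)
    K <- A %x% B   # identical to kronecker(A, B); a 4 x 4 block matrix
    K
    ```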
  15.

    If initial parameters are not specified, the code generates a vector of length p + q filled with small values (.1s) that often suffice.

References

  • Box, G., & Jenkins, G. (1970). Time series analysis: Forecasting and control. San Francisco: Holden-Day.

  • Dickey, D. A., & Fuller, W. A. (1979). Distribution of the estimators for autoregressive time series with a unit root. Journal of the American Statistical Association, 74, 427–431.

  • Ljung, G., & Box, G. (1978). On a measure of lack of fit in time series models. Biometrika, 65, 297–303.

  • Yule, G. U. (1927). On a method of investigating periodicities in disturbed series, with special reference to Wolfer's sunspot numbers. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 226, 267–298.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Brown, J.D. (2018). Time Series Analysis. In: Advanced Statistics for the Behavioral Sciences. Springer, Cham. https://doi.org/10.1007/978-3-319-93549-2_13
