Chapter 29. Time Series

Table of Contents

Introduction
Method description
Usage
Data requirements
Model building
Model testing
Model application
Examples
Model building
Model testing
Model application
References

Introduction

In statistics and signal processing, a time series is a sequence of data points, typically measured at successive times and spaced at (often uniform) time intervals. Time series analysis comprises methods that attempt to understand such series, either to understand the underlying process behind the data points (where did they come from? what generated them?) or to make predictions. Time series prediction is the use of a model to predict future events based on known past events: to predict future data points before they are measured.

Models for time series data can take many forms. Three broad classes of practical importance are the autoregressive (AR) models, the integrated (I) models, and the moving average (MA) models. These three classes depend linearly on previous data points and are treated in more detail under autoregressive moving average (ARMA) and autoregressive integrated moving average (ARIMA) models. Non-linear dependence on previous data points is also of interest because of the possibility of producing a chaotic time series.

A number of different notations are in use for time-series analysis:

X = {X1, X2, ...} is a common notation which specifies a time series X indexed by the natural numbers. The Time Series Module constructs ARCH (Autoregressive Conditional Heteroskedasticity) and GARCH (Generalized Autoregressive Conditional Heteroskedasticity) models. These models predict the variance of the error at time point t based on the behavior of the series in previous steps.
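The variance recursion these models use can be sketched as follows. The snippet below is a minimal, self-contained illustration (not the module's actual implementation) of the GARCH(1,1) case, where the conditional variance at time t is sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}; the parameter values are assumed for illustration, not estimated from data.

```python
def garch11_variances(residuals, omega=0.1, alpha=0.1, beta=0.8):
    """Return the conditional variance series implied by a GARCH(1,1)
    model with given (assumed, not estimated) parameters.

    residuals -- sequence of model errors eps_1, ..., eps_T
    """
    # Start the recursion at the unconditional (long-run) variance,
    # omega / (1 - alpha - beta); requires alpha + beta < 1.
    sigma2 = [omega / (1.0 - alpha - beta)]
    for eps in residuals:
        # Variance at t depends on the previous squared error (ARCH term)
        # and the previous conditional variance (GARCH term).
        sigma2.append(omega + alpha * eps**2 + beta * sigma2[-1])
    return sigma2

# Example: variance forecasts for a short sequence of residuals.
residuals = [0.5, -1.2, 0.3, 2.0, -0.4]
variances = garch11_variances(residuals)
```

Setting beta = 0 reduces the recursion to an ARCH(1) model, where the variance depends only on the previous squared error. In practice the parameters omega, alpha, and beta are estimated by maximum likelihood rather than fixed in advance.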