The GLS Transformation Matrix and a Semi-Recursive Estimator for the Linear Regression Model with ARMA Errors
JOHN W. GALBRAITH AND VICTORIA ZINDE-WALSH
McGill University
For a general stationary ARMA(p,q) process u we derive the exact form of
the orthogonalizing matrix R such that R'R = C^{-1}, where C = E(uu') is the
covariance matrix of u, generalizing the known formulae for AR(p) processes.
In a linear regression model with an ARMA(p,q) error process, transforming
the data by R yields a regression model with white-noise errors. We also consider
an application to semi-recursive estimation (recursive in the model parameters,
but not in the parameters of the error process).
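To make the transformation concrete, the following Python sketch (not part of the original paper, and using a numerical Cholesky factorization rather than the closed-form R derived below) builds the error covariance matrix C from the ARMA autocovariances, takes R = L^{-1} from the factorization C = LL' so that R'R = C^{-1}, and runs OLS on the R-transformed data. The function name and argument conventions are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz, cholesky, solve_triangular
from statsmodels.tsa.arima_process import arma_acovf

def gls_transform_estimate(y, X, ar, ma, sigma2=1.0):
    """OLS on R-transformed data, where R'R = C^{-1} (a numerical GLS sketch).

    ar and ma follow the statsmodels convention and include the leading 1,
    e.g. ar = [1, -0.6], ma = [1, 0.4] for an ARMA(1,1) error process.
    """
    n = len(y)
    gamma = arma_acovf(ar, ma, nobs=n, sigma2=sigma2)  # autocovariances of u
    C = toeplitz(gamma)                                # covariance matrix C = E(uu')
    L = cholesky(C, lower=True)                        # C = L L'
    # R = L^{-1} satisfies R'R = C^{-1}; apply R by triangular solves
    y_star = solve_triangular(L, y, lower=True)
    X_star = solve_triangular(L, X, lower=True)
    beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
    return beta
```

Running OLS on (X_star, y_star) reproduces the GLS estimator for known ARMA parameters; in practice those parameters would themselves be estimated.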
1. INTRODUCTION
There have been many contributions to the literature on estimation
of the linear regression model with an autocorrelated error process;
surveys are provided by, for example, Judge et al. [15]. The
main technical difficulties arise in inverting the error covariance matrix, finding
the orthogonalizing transformation for the errors, or evaluating the relevant
quadratic forms in the inverse matrix by other means.
Among results applicable to a general ARMA(p, q) process, we may distinguish
several approaches or classes of result. The numerical approach, exemplified
by Harvey and Phillips [12], seeks maximum likelihood estimates
for the full model through numerical optimization of the likelihood; in [12],
this is achieved by expressing quadratic forms (e.g., in the likelihood function)
through recursive residuals which can be computed by the Kalman filter.
One difficulty in doing so lies in the necessity of initializing the recursive
filtering algorithm.
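As an illustration of this numerical approach in spirit (not Harvey and Phillips's own algorithm), the following Python sketch fits a regression with ARMA(1,1) errors by maximum likelihood through the state-space form, in which the Kalman filter evaluates the exact likelihood and, for a stationary error process, can be initialized from the stationary distribution. The simulated data and parameter values are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.arima_process import arma_generate_sample

# Hypothetical data: y = X beta + u, with u an ARMA(1,1) process
rng = np.random.default_rng(0)
n = 200
X = sm.add_constant(rng.normal(size=n))
u = arma_generate_sample(ar=[1, -0.6], ma=[1, 0.4], nsample=n)
y = X @ np.array([1.0, 2.0]) + u

# State-space maximum likelihood: the Kalman filter evaluates the likelihood,
# with the filter initialized from the stationary distribution of the errors
model = sm.tsa.SARIMAX(y, exog=X, order=(1, 0, 1), trend='n')
result = model.fit(disp=False)
print(result.params)  # regression coefficients and ARMA parameters
```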