Variables in the Model
In the classical linear regression model, choosing which variables to include is an important aspect of specification. Here, the problem of multicollinearity needs considerable attention.
The term multicollinearity refers to the situation in which two or more explanatory variables are linearly related, or correlated. If the explanatory variables are perfectly linearly related, it is impossible to obtain a numerical value for each parameter separately, and the method of OLS breaks down.
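A minimal numerical sketch of why perfect collinearity breaks OLS, using a hypothetical design matrix in which one regressor is an exact linear function of another: the normal equations require inverting X'X, but the matrix is rank-deficient.

```python
import numpy as np

# Hypothetical design matrix: x2 is an exact linear function of x1,
# so the columns of X are perfectly collinear.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 2.0 * x1 + 1.0                       # perfect linear dependence on x1
X = np.column_stack([np.ones(n), x1, x2])

# OLS solves the normal equations (X'X) b = X'y. Under perfect
# collinearity X'X is singular: its rank falls below the number of
# parameters, so no unique solution exists.
XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))         # 2, not 3
```

With high but imperfect correlation, X'X remains technically invertible, but it is nearly singular, which inflates the variances of the estimates.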
Since economic data are obtained from uncontrolled experiments, some degree of intercorrelation among the explanatory variables is almost always present. When the variables are highly (but not perfectly) correlated, however, it becomes difficult to disentangle the separate effect of each explanatory variable, and the accuracy and stability of the parameter estimates may be impaired.
For time series data, stationarity is the first fundamental statistical property that most statistical models require. Tests of stationarity are based on the so-called autocorrelation function and/or unit root tests. A unit root test can be conducted using the Dickey-Fuller (DF) test, the Augmented Dickey-Fuller (ADF) test, and the Phillips-Perron test, among others.
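A minimal sketch of the Dickey-Fuller idea (the simple DF regression without augmentation terms, implemented by hand as an assumption-labeled illustration rather than a production test): regress the first difference of the series on a constant and its own lagged level, and examine the t-ratio on the lagged level. A t-ratio well below the 5% critical value (about -2.86 when a constant is included) rejects the unit root.

```python
import numpy as np

def df_tstat(y):
    """t-ratio on y[t-1] in the Dickey-Fuller regression
    diff(y)[t] = a + g * y[t-1] + e[t] (no augmentation lags)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(42)
e = rng.normal(size=500)
t_stationary = df_tstat(e)                      # white noise: stationary
t_unit_root = df_tstat(np.cumsum(e))            # random walk: unit root
print(f"white noise t-ratio: {t_stationary:.2f}")   # strongly negative
print(f"random walk t-ratio: {t_unit_root:.2f}")    # much closer to zero
```

In practice one would use a library implementation (e.g. the ADF test in statsmodels), which adds lagged difference terms and supplies the correct nonstandard critical values.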
When two trending series are regressed on one another, we may observe a high R², which may not reflect the true degree of association between them but simply the common trend present in both. This situation exemplifies the problem of spurious regression.
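The spurious-regression effect is easy to reproduce by simulation. In this hypothetical Monte Carlo sketch, one independent random walk is regressed on another: the series are unrelated by construction, yet the shared stochastic-trend behavior tends to produce a sizeable R². First-differencing both series removes the trends and R² collapses toward zero.

```python
import numpy as np

def r_squared(y, x):
    """R^2 from a simple OLS regression of y on a constant and x."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
levels, diffs = [], []
for _ in range(200):
    x = np.cumsum(rng.normal(size=300))   # two independent random walks
    y = np.cumsum(rng.normal(size=300))
    levels.append(r_squared(y, x))                       # spurious fit
    diffs.append(r_squared(np.diff(y), np.diff(x)))      # after differencing

mean_levels = np.mean(levels)
mean_diffs = np.mean(diffs)
print(f"mean R^2, levels:      {mean_levels:.3f}")   # sizeable
print(f"mean R^2, differences: {mean_diffs:.3f}")    # near zero
```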
The trend may be deterministic or stochastic. A trend is deterministic if it is perfectly predictable; if the trend line itself shifts over time, the series exhibits a stochastic trend.
With a deterministic trend, the variables can be made stationary by including a trend variable in the regression. With a stochastic trend, tests for nonstationarity and cointegration are needed.
We can also apply a data transformation that turns a nonstationary time series into a stationary one. The most common is the log transformation, used when the standard deviation is believed to be directly proportional to the mean.
The effect of a linear trend in the series can be removed by taking first differences; to remove a quadratic trend, the series must be differenced twice.
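A short sketch of differencing applied to assumed deterministic trends (the coefficients below are illustrative): the first difference of a linear trend is a constant, and the second difference of a quadratic trend is a constant.

```python
import numpy as np

t = np.arange(100, dtype=float)       # assumed time index

linear = 2.0 + 3.0 * t                # linear trend
quadratic = 1.0 + 0.5 * t + 0.1 * t**2  # quadratic trend

d1 = np.diff(linear)                  # first difference: constant (slope 3)
d2 = np.diff(quadratic, n=2)          # second difference: constant (~0.2)
print(d1[:3], d2[:3])
```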
If nonstationarity enters through the variance of a process, heteroscedasticity becomes a problem. One way to counteract this effect is to apply a power transformation, which can render a time-varying variance constant and/or make a non-normal process approximately normal.
The most commonly used group of such transformations is the Box-Cox family of power transformations, defined as (y^λ − 1)/λ for λ ≠ 0 and log y for λ = 0.
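A minimal sketch using scipy's Box-Cox implementation on hypothetical right-skewed positive data (Box-Cox requires positive observations). scipy estimates the power parameter λ by maximum likelihood; for lognormal data the estimate should land near λ = 0, i.e. the log transformation.

```python
import numpy as np
from scipy import stats

# Hypothetical skewed, strictly positive series
rng = np.random.default_rng(7)
y = rng.lognormal(mean=0.0, sigma=1.0, size=1000)

# boxcox returns the transformed data and the MLE of lambda
y_bc, lam = stats.boxcox(y)
print(f"estimated lambda: {lam:.3f}")                   # near 0 for lognormal data
print(f"skewness before: {stats.skew(y):.2f}, "
      f"after: {stats.skew(y_bc):.2f}")                 # skewness shrinks
```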