Tuesday, June 21, 2011

CHAPTER 13: ECONOMETRIC MODELING: MODEL SPECIFICATION AND DIAGNOSTIC TESTING

ü  Model Selection Criteria
1.      Be data admissible.
2.      Be consistent with theory.
3.      Have weakly exogenous regressors.
4.      Exhibit parameter constancy.
5.      Exhibit data coherency.
6.      Be encompassing.
ü  Types of Specification Errors
1.      Omission of relevant variable(s).
2.      Inclusion of unnecessary variable(s).
3.      Adopting the wrong functional form.
4.      Errors of measurement.
5.      Incorrect specification of the stochastic error term.
ü  Consequences of Model Specification Errors
·         Underfitting a Model (Omitting a Relevant Variable)
The consequences of omitting variable X3 are as follows:
1.      If the left-out, or omitted, variable X3 is correlated with the included variable X2, that is, r23, the correlation coefficient between the two variables, is nonzero, α1 and α2 are biased as well as inconsistent.
2.      Even if X2 and X3 are not correlated, α1 is biased, although α2 is now unbiased.
3.      The disturbance variance σ² is incorrectly estimated.
4.      The conventionally measured variance of α2 (= σ²/∑x2i²) is a biased estimator of the variance of the true estimator β2.
5.      In consequence, the usual confidence interval and hypothesis testing procedures are likely to give misleading conclusions about the statistical significance of the estimated parameters.
6.      The forecasts based on the incorrect model and forecast intervals will be unreliable.
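Consequence 1 above can be seen in a small simulation. This is an illustrative sketch, not from the text: all variable names, coefficient values, and the degree of correlation between X2 and X3 are arbitrary assumptions chosen so that the bias is visible.

```python
import numpy as np

# Illustrative simulation (assumed setup): true model is
# Y = b1 + b2*X2 + b3*X3 + u, but we regress Y on X2 alone.
rng = np.random.default_rng(0)
n, reps = 200, 500
b1, b2, b3 = 1.0, 2.0, 3.0

alpha2_hats = []
for _ in range(reps):
    x2 = rng.normal(size=n)
    x3 = 0.8 * x2 + rng.normal(scale=0.5, size=n)  # X3 correlated with X2
    y = b1 + b2 * x2 + b3 * x3 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x2])          # X3 omitted
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    alpha2_hats.append(coef[1])

# Omitted-variable bias: E(alpha2) = b2 + b3 * cov(X3, X2)/var(X2)
# which here is 2 + 3 * 0.8 = 4.4, so the bias is about 2.4.
bias = np.mean(alpha2_hats) - b2
```

The bias does not shrink as n grows, which is why the estimator is inconsistent as well as biased.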
·         Inclusion of an Irrelevant Variable (Overfitting a Model)
The consequences of this specification error are as follows:
1.      The OLS estimators of the parameters of the “incorrect” model are all unbiased and consistent, that is, E(α1) = β1, E(α2) = β2 and E(α3) = β3 = 0.
2.      The error variance σ² is correctly estimated.
3.      The usual confidence interval and hypothesis testing procedures remain valid.
4.      However, the estimated α’s will be generally inefficient, that is, their variances will be generally larger than those of the β’s of the true model.
ü  Test of Specification Errors
·         Detecting the Presence of Unnecessary Variables (Overfitting a Model)
Bottom-up approach (data mining) – starting with a smaller model and expanding it as one goes along.
·         Tests for Omitted Variables and Incorrect Functional Form
-          Examination of Residuals
-          The Durbin-Watson d Statistic Once Again
To use the Durbin-Watson test for detecting model specification error(s), we proceed as follows:
1.      From this assumed model, obtain the OLS residuals.
2.      If it is believed that the assumed model is mis-specified because it excludes a relevant explanatory variable, say, Z, order the residuals obtained in Step 1 according to increasing values of Z.
3.      Compute the d statistic.
4.      If the estimated d value is significant according to the Durbin-Watson tables, one can accept the hypothesis of model mis-specification.
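The steps above can be sketched in code. This is a hypothetical illustration: the variable Z and the pattern in the residuals are invented so that the omitted-variable signal is easy to see, and the d value would still need to be compared against the tables.

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson d: sum of squared successive differences
    of the residuals divided by their sum of squares."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Hypothetical residuals from a model that omits Z, ordered by
# increasing Z (Step 2): they inherit a systematic pattern in Z.
rng = np.random.default_rng(2)
z = np.sort(rng.uniform(-3, 3, size=100))
resid = z ** 2 - np.mean(z ** 2) + rng.normal(scale=0.3, size=100)

d = durbin_watson(resid)   # Step 3
# A d value far below 2 reflects the systematic pattern in Z,
# suggesting a relevant variable has been left out.
```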
-          Ramsey’s RESET Test
Steps involved in RESET are as follows:
1.      From the chosen model, obtain the estimated Yi, that is, Ŷi.
2.      Rerun the model introducing Ŷi in some form (e.g., Ŷi² and Ŷi³) as additional regressor(s).
3.      Use the F test, namely

F = [(R²new – R²old)/number of new regressors] / [(1 – R²new)/(n – number of parameters in the new model)]
4.      If the computed F value is significant, say, at the 5 percent level, one can accept the hypothesis that the model is mis-specified.
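The RESET steps can be sketched as follows. The data are assumed for illustration: the true relation is made quadratic while the fitted model is linear, so the powers of Ŷi pick up the neglected curvature.

```python
import numpy as np

# Assumed data: true relation is quadratic, chosen model is linear.
rng = np.random.default_rng(3)
n = 120
x = rng.uniform(0, 10, size=n)
y = 1.0 + 0.5 * x + 0.3 * x ** 2 + rng.normal(size=n)

def ols_fit(X, y):
    """Return (R-squared, fitted values) from an OLS regression."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
    return r2, fitted

# Step 1: fit the chosen (linear) model, keep Y-hat.
X_old = np.column_stack([np.ones(n), x])
r2_old, yhat = ols_fit(X_old, y)

# Step 2: rerun with powers of Y-hat as additional regressors.
X_new = np.column_stack([np.ones(n), x, yhat ** 2, yhat ** 3])
r2_new, _ = ols_fit(X_new, y)

# Step 3: the F test from the text.
m = 2                  # number of new regressors
k = X_new.shape[1]     # parameters in the new model
F = ((r2_new - r2_old) / m) / ((1 - r2_new) / (n - k))
# Step 4: a large F (beyond the 5% critical value) flags mis-specification.
```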
-          Lagrange Multiplier (LM) Test for Adding Variables
1.      Estimate the restricted regression by OLS and obtain the residuals, ûi.
2.      If in fact the unrestricted regression

Yi = β1 + β2Xi + β3Xi² + β4Xi³ + ui
             is the true regression, the residuals obtained in:

Yi = λ1 + λ2Xi + u3i
             should be related to the squared and cubed terms, that is, Xi² and Xi³.
3.      This suggests that we regress the ûi obtained in Step 1 on all the regressors, which in the present case means

ûi = α1 + α2Xi + α3Xi² + α4Xi³ + vi
             where vi is an error term with the usual properties.
4.      Symbolically, we write

nR² ∼asy χ²(number of restrictions)
5.      If nR² exceeds the critical chi-square value at the chosen level of significance, we reject the restricted regression.
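A sketch of the LM test under assumed data: the true relation is made cubic so that the restricted linear fit is visibly inadequate, and nR² from the auxiliary regression is compared with the χ² critical value for two restrictions.

```python
import numpy as np

# Assumed data: cubic truth, restricted model is linear in X.
rng = np.random.default_rng(4)
n = 150
x = rng.uniform(1, 5, size=n)
y = 2.0 + 1.5 * x + 0.8 * x ** 3 + rng.normal(size=n)

def ols_resid(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef

# Step 1: restricted regression of Y on X; keep the residuals.
u_hat = ols_resid(np.column_stack([np.ones(n), x]), y)

# Step 3: auxiliary regression of the residuals on ALL regressors
# (1, X, X^2, X^3); record its R-squared.
X_full = np.column_stack([np.ones(n), x, x ** 2, x ** 3])
fitted = X_full - X_full  # placeholder shape; replaced below
coef, *_ = np.linalg.lstsq(X_full, u_hat, rcond=None)
fitted = X_full @ coef
r2_aux = 1 - np.sum((u_hat - fitted) ** 2) / np.sum((u_hat - u_hat.mean()) ** 2)

# Steps 4-5: nR^2 is asymptotically chi-square with 2 df here
# (two restrictions: the X^2 and X^3 coefficients are zero).
lm = n * r2_aux
# Compare lm with the 5% critical value, 5.99 for 2 df.
```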
ü  Errors of Measurement
·         Errors of Measurement in the Dependent Variable Y
Although the errors of measurement in the dependent variable still give unbiased estimates of the parameters and their variances, the estimated variances are now larger than in the case where there are no such errors of measurement.
ü  Tests of Non-nested Hypotheses
1.      Discrimination Approach – where given two or more competing models, one chooses a model based on some criteria of goodness of fit.
2.      Discerning Approach – where, in investigating one model, we take into account information provided by other models.
ü  Model Selection Criteria
·         The R2 Criterion

R² = ESS/TSS = 1 – RSS/TSS
·         Adjusted R2

R̄² = 1 – [RSS/(n – k)] / [TSS/(n – 1)] = 1 – (1 – R²)(n – 1)/(n – k)
·         Akaike Information Criterion (AIC)

AIC = e^(2k/n) ∑ûi²/n = e^(2k/n) RSS/n

ln AIC = (2k/n) + ln(RSS/n)
·         Schwarz Information Criterion (SIC)

SIC = n^(k/n) ∑ûi²/n = n^(k/n) RSS/n

ln SIC = (k/n) ln n + ln(RSS/n)
·         Mallows’s Cp Criterion

Cp = RSSp/σ̂² – (n – 2p)

E(Cp) ≈ [(n – p)σ²]/σ² – (n – 2p) = p
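The criteria above can be computed side by side. A minimal sketch with assumed data: two models are fitted, the true linear one and one padded with a junk regressor, so the trade-off between fit and parsimony is visible.

```python
import numpy as np

# Assumed data: true model is Y = 1 + 2*X + u.
rng = np.random.default_rng(5)
n = 80
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

def criteria(X, y):
    """R-squared, adjusted R-squared, AIC and SIC for an OLS fit,
    using the formulas given above."""
    n, k = X.shape
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ coef) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    r2 = 1 - rss / tss
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)
    aic = np.exp(2 * k / n) * rss / n   # AIC = e^(2k/n) * RSS/n
    sic = n ** (k / n) * rss / n        # SIC = n^(k/n) * RSS/n
    return r2, adj_r2, aic, sic

X2 = np.column_stack([np.ones(n), x])                      # true model, k = 2
X3 = np.column_stack([np.ones(n), x, rng.normal(size=n)])  # + junk regressor, k = 3
crit2, crit3 = criteria(X2, y), criteria(X3, y)
# R^2 never falls when a regressor is added; adjusted R^2, AIC and SIC
# penalise the extra parameter instead of rewarding it automatically.
```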


ü  Additional Topics in Econometric Modeling
·         Outliers, Leverage and Influence
An outlier is defined as an observation with a “large residual”; a leverage point is an observation with an extreme value of an explanatory variable; an influential observation is one whose removal changes the estimates substantially.
·         Recursive Least Squares
·         Chow’s Prediction Failure Test
ü  A Word to the Practitioner
“Ten Commandments of Applied Econometrics”
1.      Thou shalt use common sense and economic theory.
2.      Thou shalt ask the right questions.
3.      Thou shalt know the context.
4.      Thou shalt inspect the data.
5.      Thou shalt not worship complexity.
6.      Thou shalt look long and hard at thy results.
7.      Thou shalt beware the costs of data mining.
8.      Thou shalt be willing to compromise.
9.      Thou shalt not confuse significance with substance.
10.  Thou shalt confess in the presence of sensitivity.








