
Econometrics course teaching resource (PPT slides, English): Chapter 10 Multicollinearity - What Happens if Explanatory Variables are Correlated

Document information
Resource type: library document
Format: PPT
Pages: 28
File size: 169.5KB
Summary
One of the CLRM assumptions is that there is no perfect multicollinearity, i.e., no exact linear relationships among the explanatory variables (the Xs) in a multiple regression. In practice one rarely encounters perfect multicollinearity, but cases of near or very high multicollinearity, where the explanatory variables are approximately linearly related, arise frequently in applications.

Chapter 10 Multicollinearity: What Happens if Explanatory Variables are Correlated


One of the CLRM assumptions is that there is no perfect multicollinearity, i.e., no exact linear relationships among the explanatory variables (the Xs) in a multiple regression. In practice one rarely encounters perfect multicollinearity, but cases of near or very high multicollinearity, where the explanatory variables are approximately linearly related, arise frequently in applications.


The objectives of this chapter:
● The nature of multicollinearity;
● Is multicollinearity really a problem?
● The theoretical consequences of multicollinearity;
● How to detect multicollinearity;
● The remedial measures that can be used to eliminate multicollinearity.


10.1 The Nature of Multicollinearity: The Case of Perfect Multicollinearity
In cases of a perfect linear relationship, or perfect multicollinearity, among the explanatory variables, we cannot obtain unique estimates of all the parameters. And since we cannot obtain unique estimates, we cannot draw any statistical inferences (i.e., hypothesis tests) about them from a given sample.


Yi = A1 + A2 X2i + A3 X3i + μi
Suppose the exact linear relation X3i = 300 - 2X2i holds. Substituting:
Yi = A1 + A2 X2i + A3 (300 - 2X2i) + μi
   = (A1 + 300 A3) + (A2 - 2 A3) X2i + μi
   = C1 + C2 X2i + μi
Estimation: OLS yields estimators of C1 = A1 + 300 A3 and C2 = A2 - 2 A3. So from the estimators of C1 and C2 we cannot recover the estimators of A1, A2, and A3 individually.
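The substitution above can be checked numerically. The sketch below uses made-up data (the coefficient values 5, 2, 0.5 and the sample size are illustrative assumptions): with X3 = 300 - 2X2 the design matrix is rank-deficient, so unique OLS estimates of all three parameters do not exist, while the identifiable combinations C1 and C2 are estimated without trouble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data obeying the exact relation X3 = 300 - 2*X2
n = 50
X2 = rng.uniform(10, 100, n)
X3 = 300 - 2 * X2                      # perfect collinearity
# Illustrative true values: A1 = 5, A2 = 2, A3 = 0.5
Y = 5 + 2 * X2 + 0.5 * X3 + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), X2, X3])

# The design matrix has only 2 linearly independent columns, not 3,
# so X'X is singular and unique OLS estimates of A1, A2, A3 fail:
print(np.linalg.matrix_rank(X))        # 2

# Regressing Y on X2 alone estimates the identifiable combinations
# C1 = A1 + 300*A3 = 155 and C2 = A2 - 2*A3 = 1:
Z = np.column_stack([np.ones(n), X2])
c1, c2 = np.linalg.lstsq(Z, Y, rcond=None)[0]
print(c1, c2)                          # close to 155 and 1
```

Any attempt to invert X'X here fails (or returns one of infinitely many solutions), which is the numerical face of the identification problem described above.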


That is, in cases of perfect multicollinearity, estimation and hypothesis testing about individual regression coefficients in a multiple regression are not possible. We can only obtain estimates of a linear combination of the original coefficients, not of each of them individually.


10.2 The Case of Near, or Imperfect, or High Multicollinearity
When we talk about multicollinearity, we usually refer to imperfect multicollinearity:
X3i = B1 + B2 X2i + ei
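A minimal numerical sketch of this auxiliary relation, with made-up values of B1, B2, and the error spread: when the disturbance ei is small relative to the variation in X2, the two regressors are almost, but not exactly, linearly related, and their sample correlation is close to 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical near-collinear regressors following X3 = B1 + B2*X2 + e,
# with illustrative values B1 = 50, B2 = 1.5 and a small error term
n = 100
X2 = rng.uniform(10, 100, n)
e = rng.normal(0, 2, n)                # small e keeps collinearity imperfect
X3 = 50 + 1.5 * X2 + e

# The sample correlation between the regressors is very close to 1:
r = np.corrcoef(X2, X3)[0, 1]
print(r)
```

Because e is not identically zero, X'X is still invertible and OLS runs, but, as the following sections explain, the estimates become imprecise.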


If there are just two explanatory variables, the coefficient of correlation r can be used as a measure of the degree or strength of collinearity. But if more than two explanatory variables are involved, as we will show later, the coefficient of correlation may not be an adequate measure of collinearity.
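The inadequacy of pairwise correlations can be illustrated with a constructed example (the data-generating scheme below is an assumption for illustration, not from the chapter): three mutually independent regressors plus a fourth that is nearly their sum. No single pairwise correlation looks alarming, yet the fourth variable is almost perfectly explained by the other three jointly, as the R-squared of its auxiliary regression reveals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: X1, X2, X3 independent; X4 is (almost) their sum
n = 200
X1, X2, X3 = rng.normal(size=(3, n))
X4 = X1 + X2 + X3 + rng.normal(0, 0.1, n)

# Pairwise correlations of X4 with each regressor are only moderate
# (around 0.58 each), which would not, by itself, signal trouble:
pairwise = np.corrcoef([X1, X2, X3, X4])[3, :3]
print(pairwise)

# But the auxiliary regression of X4 on X1, X2, X3 has R^2 near 1,
# revealing severe multicollinearity:
Z = np.column_stack([np.ones(n), X1, X2, X3])
resid = X4 - Z @ np.linalg.lstsq(Z, X4, rcond=None)[0]
R2 = 1 - resid.var() / X4.var()
print(R2)
```

This auxiliary-regression R-squared is the quantity underlying the variance inflation factor, VIF = 1 / (1 - R^2), a standard diagnostic for collinearity among several regressors.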


10.3 Theoretical Consequences of Multicollinearity
Note: we consider only the case of imperfect multicollinearity. When collinearity is not perfect, the OLS estimators remain BLUE, even though one or more of the partial regression coefficients in a multiple regression may be individually statistically insignificant.


1. OLS estimators are unbiased. But unbiasedness is a repeated-sampling property; in reality, we rarely have the luxury of replicating samples.
2. OLS estimators have minimum variance. This does not mean, however, that the variance of an OLS estimator will be small in any given sample; minimum variance does not mean that every numerical value of the variance will be small.
3. Multicollinearity is essentially a sample (regression) phenomenon.
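Points 1 and 2 can be seen in a small Monte Carlo sketch (the coefficient values, sample size, and correlation levels below are illustrative assumptions): across repeated samples the OLS slope estimator stays centered on the true value whether or not the regressors are collinear, but its sampling spread is far larger in the highly collinear case.

```python
import numpy as np

rng = np.random.default_rng(3)

def slope_estimates(rho, n=30, reps=2000):
    """Repeatedly draw samples with corr(X2, X3) ~ rho and return the
    OLS estimates of the X2 slope (true value 1.0 in every draw)."""
    est = np.empty(reps)
    for i in range(reps):
        X2 = rng.normal(size=n)
        X3 = rho * X2 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        Y = 1.0 * X2 + 1.0 * X3 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), X2, X3])
        est[i] = np.linalg.lstsq(X, Y, rcond=None)[0][1]
    return est

low = slope_estimates(rho=0.1)    # nearly uncorrelated regressors
high = slope_estimates(rho=0.95)  # highly collinear regressors

# Both means sit near the true value 1.0 (unbiasedness holds) ...
print(low.mean(), high.mean())
# ... but the sampling standard deviation is much larger under
# high collinearity (large variance in any one sample is likely):
print(low.std(), high.std())
```

This is exactly the textbook point: BLUE is a statement about the sampling distribution, not a guarantee that the single sample at hand yields a precise estimate.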
