Econometrics I
Peking University HSBC School of Business
Professor Ming Guo
Department of Finance
Abstract: This is an intermediate-level course in econometrics. Topics to be studied include specification, estimation, and inference in models that begin with, and then extend beyond, the standard linear multiple regression framework. After a review of the linear model, we will develop the asymptotic distribution theory necessary for the analysis of generalized linear and nonlinear models. We will then turn to instrumental variables, maximum likelihood, and generalized method of moments (GMM) estimation. Inference techniques used in the linear regression framework, such as the t and F tests, will be extended to the Wald, Lagrange multiplier, and likelihood ratio tests. Specific modeling frameworks will include the linear and nonlinear regression models.
Prerequisites: Multivariate calculus, matrix algebra, probability and distribution theory, statistical inference, and an introduction to the multiple linear regression model. Appendices A and B in Greene (2008) are assumed. We will survey the parts of Appendix C and Chapter 2 that would have appeared in prerequisite courses. A significant part of this course will focus on the advanced parts of Appendices C and D and Chapters 4 through 7.
Course Requirements: Grades for the course will be based on:
- Final exam (60%). It will be given in class.
- Several problem sets (20%).
- Small projects (20%).
- Text: The required text for the course is Greene, W., Econometric Analysis, 6th Edition, Prentice Hall, 2008. (You may use the 5th edition if you prefer.) Reading assignments below are given in square brackets for the 6th edition, followed in angle brackets by the corresponding sections of the 5th edition.
- Software: Some of the outside work for this course will involve using a computer. Students may use any computer software that they are familiar with for this purpose.
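By way of illustration, the sketch below fits a linear regression by ordinary least squares. Python with numpy is one possible choice of software (an assumption for this sketch, not a course requirement), and the data are simulated:

    import numpy as np

    # Simulated data from y = 1 + 2x + disturbance
    rng = np.random.default_rng(0)
    n = 100
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.normal(size=n)

    # Design matrix with a constant term
    X = np.column_stack([np.ones(n), x])

    # OLS coefficients b = (X'X)^(-1) X'y, computed stably via least squares
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(b)  # approximately [1.0, 2.0]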
Course Outline:
- Prerequisites: Review of Matrix Algebra and Probability and Distribution Theory
- I. The Paradigm of Econometrics [Chapter 1 (pp. 1-6)] < [Chapter 1 (pp. 1-6)]>
- A. Modeling in economics
- B. Econometrics: statistics, economics, mathematics
- C. Econometric modeling: understanding, prediction, control
- D. Modeling frameworks:
- 1. Bayesian and Classical (frequentist) approaches [14.2, Chapter 18] < [16.2 - 16.2.2, pp. 425-429]>
- 2. Estimation and inference: Nonparametric, semiparametric, parametric [14.3, 14.4]
- E. Estimation and inference in econometrics, methodological issues
- II. The Classical Linear Regression Model. Part 1. Specification and Computation
- A. The conditional mean function [B.1-B.3, B.7-B.8] < [B.1-B.3, B.7-B.8]>; regression (Waugh)
- B. The classical linear regression model and its functional form
- 1. The linear regression model [2.1] < [2.1]>
- 2. Linear models and intrinsic linearity [2.1-2.3] < [2.1-2.3]>
- 3. Logs and levels, estimating elasticities [2.3.1] < [2.3.1]>
- 4. Functional form and linearity. Transformations and dummy variables [6.1-6.3] < [7.1-7.3]>
- 5. Linearized regression and Taylor series, linearity in economic modelling [2.3.1, 11.1-11.2] < [2.3.1, pp. 162-163]>
- C. Least squares regression [Chapter 3] < [ch. 3]> (Frisch and Waugh, Longley)
- 1. Least squares regression [3.1-3.2] < [3.1-3.2]>
- 2. Partitioned regression and the Frisch-Waugh theorem [3.3, 3.4] < [3.3, 3.4]> (a numerical sketch follows at the end of this section)
- 3. Applications of partitioned regression: a fixed effects model [9.4.1] < [13.3.1 up to result (13-6)]>
- D. Evaluating the fit of the regression [3.5] < [3.5]>, ANOVA, Adjusted R2 [3.5, 7.4] < [3.5, 8.4]>
- E. Least squares with restrictions [5.3.2-5.3.3] < [6.3.2-6.3.3]>
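The Frisch-Waugh theorem in item C.2 above can be verified numerically: the coefficients on a block of regressors in the full regression equal the coefficients from regressing the residuals of y, after partialling out the other block, on the corresponding residuals of that block. A minimal sketch, assuming Python with numpy and simulated data:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    X1 = np.column_stack([np.ones(n), rng.normal(size=n)])  # first block (with constant)
    X2 = rng.normal(size=(n, 2))                            # second block
    y = X1 @ np.array([1.0, 0.5]) + X2 @ np.array([2.0, -1.0]) + rng.normal(size=n)

    # Full regression of y on [X1, X2]
    b_full, *_ = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)

    # Partial X1 out of y and X2 with the residual maker M1 = I - X1(X1'X1)^(-1)X1'
    M1 = np.eye(n) - X1 @ np.linalg.solve(X1.T @ X1, X1.T)
    b_fw, *_ = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)

    # The two sets of X2 coefficients coincide
    print(np.allclose(b_full[2:], b_fw))  # True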
- III. The Classical Linear Regression Model. Part 2. Statistical Inference in Finite Samples
- A. Statistical properties of the least squares estimator in finite samples [4.1-4.5] < [4.1-4.5]>
- 1. Why least squares? [4.2] < [4.2]>
- 2. Sampling distributions [Example 4.1] < [example 4.1]>
- 3. Expectation [4.3] < [4.3]>
- 4. The effects of omitted and superfluous variables - the omitted variable formula [7.2.1-7.2.3] < [8.2.1, 8.2.3]>
- 5. Variance of the least squares estimator [4.4, 7.2.2] < [4.4, 8.2.2]>
- 6. The Gauss-Markov theorem [4.4, 4.5] < [4.4, 4.5]>
- 7. The Least Absolute Deviations Estimator [14.3.2] < [16.3.2]>
- B. Estimating the Variance of the least squares estimator
- 1. Conventional estimation [4.6] < [4.6]>
- 2. The effect of multicollinearity [4.8.1] < [4.9.1]>
- 3. Introduction to bootstrapping; least absolute deviations [17.6, 14.3.2] < [pp. 924-925, 16.3.2]>
- C. The sampling distribution of the least squares coefficient vector [Chapter 4] < [6.6.3]>
- 1. Generalities about sampling distributions [C.2-C.4] < [C.2-C.4]>
- 2. Sampling distributions and properties of estimators [C.5, 4.3-4.5] < [C.5, 4.3-4.5]>
- 3. Linear estimation and normality [4.7] < [4.7 and esp. Result 4-4 on p. 44, 4.7.1]>
- 4. Efficient estimation, precision, mean squared error [4.4, 7.2.2] < [4.4, Thm. 4.2, 8.2.2]>
- 5. Describing the sampling distribution of the estimator - the kernel density estimator [4.7, p. 1023, 14.4.1] < [4.7, p. 881, 16.4.1]> (see the simulation sketch at the end of this section)
- D. Statistical Inference in the linear model [Appendix C, 4.7.1-4.7.5, 5.1-5.3, 6.1-6.5] < [Appendix C, 4.7.1-4.7.5, 6.1-6.3, 7.1-7.6]> (Greene and Seaks)
- 1. Standard results for testing [5.1-6.4] < [6.1-7.3]>
- 2. Structural change [6.4] < [7.4, 7.5]>; model selection [7.3, 7.4] < [8.3, 8.4]>
- 3. The J test for nonnested models [7.3.3] < [8.3.3]>
- E. Prediction using the linear model [5.6] < [6.6]>
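The sampling distribution results in items C.1-C.5 above can be made concrete by simulation: hold X fixed, redraw the disturbances many times, re-estimate, and compare the spread of the slope estimates with the theoretical standard deviation implied by sigma^2 (X'X)^(-1). A minimal sketch (Python with numpy; all numbers are simulated for illustration):

    import numpy as np

    rng = np.random.default_rng(2)
    n, reps = 50, 2000
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    beta, sigma = np.array([1.0, 2.0]), 1.0

    # Theoretical standard deviation of the slope estimator
    sd_theory = sigma * np.sqrt(np.linalg.inv(X.T @ X)[1, 1])

    # Empirical sampling distribution: redraw disturbances, re-estimate
    slopes = np.empty(reps)
    for r in range(reps):
        y = X @ beta + sigma * rng.normal(size=n)
        slopes[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]

    print(sd_theory, slopes.std())  # the two should be close

A kernel density estimate of the simulated slopes would then describe the whole sampling distribution, as in item C.5.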
- IV. Asymptotic Theory and Instrumental Variables Estimation
- A. Large sample distributions, asymptotic and limiting distributions [Appendix D]
- B. Basic large sample results for the classical model [4.9] < [5.1-5.3]>
- C. Large sample results for a function of a statistic - the delta method [4.9.4] < [5.2.4]> (a worked sketch follows at the end of this section)
- D. Instrumental variables estimation and measurement error [12.1, 12.3, 12.5] < [5.4, 5.6]>
- E. Test procedures for large samples; t, F, chi-squared, Wald statistic [5.3, 5.4] < [6.4, 6.5]>
- F. The Hausman specification test [12.4] < [5.5]>
- G. A test for nonnested models, the J and Cox tests: variables [7.3] < [8.3]>, levels vs. logs [none] < [9.4.3]>
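The delta method in item C above approximates the variance of a nonlinear function of an estimator: Var[g(b)] is approximately g'(b)^2 Var(b). A worked sketch; the numerical values of the estimate and its variance are hypothetical, chosen only for illustration:

    import numpy as np

    # Hypothetical slope estimate and its estimated variance
    b_hat, var_b = 2.0, 0.04

    # Delta method for g(b) = 1/b
    g = 1.0 / b_hat
    g_prime = -1.0 / b_hat**2
    se_g = abs(g_prime) * np.sqrt(var_b)
    print(g, se_g)  # estimate of 1/beta and its delta-method standard error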
- V. Nonlinear Regression Models [11.1-11.4] < [9.1-9.4]>
- A. The Box-Cox transformation [11.3.2] < [9.3.2]>
- B. Nonlinear regression and nonlinear least squares [11.1-11.4] < [9.1-9.4]> (see the sketch at the end of this section)
- C. Two step estimation [11.6] < [9.5]>
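As a small illustration of item B above, the sketch below fits a regression function that is intrinsically nonlinear in its parameters by nonlinear least squares. scipy's curve_fit (assumed available; any nonlinear optimizer would do) minimizes the sum of squared residuals over simulated data:

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 2.0, 100)
    y = 1.5 * np.exp(0.8 * x) + 0.2 * rng.normal(size=x.size)

    # Regression function, nonlinear in the parameter b
    def f(x, a, b):
        return a * np.exp(b * x)

    # Nonlinear least squares from the starting values p0
    popt, pcov = curve_fit(f, x, y, p0=[1.0, 1.0])
    print(popt)                    # approximately [1.5, 0.8]
    print(np.sqrt(np.diag(pcov)))  # asymptotic standard errors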
- VI. The Generalized Regression Model
- A. Nonspherical disturbances [8.1] < [10.1]>
- 1. General formulation [8.1-8.3] < [10.1]>
- 2. Heteroscedasticity [8.4] < [11.1]>
- 3. Autocorrelation [19.1-19.2] < [12.1, 12.2]>
- B. Implications for least squares [8.4, 19.5] < [10.2 to 10.3, 11.2, 12.5]>
- 1. Robust covariance matrix estimation [8.3-8.4, 19.5] < [10.3-10.4, 11.3, 12.5-12.6]> (a sketch of White's estimator follows at the end of this section)
- 2. Bootstrapping [14.3.2] < [16.3.2]>
- C. Testing for nonspherical disturbances [8.5, 19.7] < [11.4, 12.7]>
- D. Generalized least squares and weighted least squares [8.3, 8.6-8.8, 19.5-19.6, 19.8-19.9] < [10.5, 11.5-11.7, 12.8-12.9]>
- 1. Heteroscedasticity [8.8] < [11.7]>
- 2. Autocorrelation [19.8-19.9] < [13.1 to 13.7]>
- E. Two step feasible GLS estimation, familiar applications [8.3.2, 8.8, 19.8-19.9] < [10.5.2,11.7, 12.9]>
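Item B.1 above can be sketched directly. Under heteroscedasticity the conventional estimator s^2 (X'X)^(-1) of the least squares covariance matrix is inappropriate, while White's robust estimator (X'X)^(-1) X' diag(e_i^2) X (X'X)^(-1) remains consistent. A minimal sketch, assuming Python with numpy and simulated data:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 300
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    # Heteroscedastic disturbances: the variance grows with |x|
    y = 1.0 + 2.0 * x + np.abs(x) * rng.normal(size=n)

    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    XtX_inv = np.linalg.inv(X.T @ X)

    # Conventional estimator: s^2 (X'X)^(-1)
    V_conv = (e @ e / (n - X.shape[1])) * XtX_inv
    # White's heteroscedasticity-robust estimator
    V_white = XtX_inv @ (X.T * e**2) @ X @ XtX_inv

    print(np.sqrt(np.diag(V_conv)))   # conventional standard errors
    print(np.sqrt(np.diag(V_white)))  # robust standard errors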
- VII. Methods of Estimation
- A. Instrumental Variables Estimation [Chapter 12] < [5.4]> (a two stage least squares sketch follows at the end of this section)
- 1. Measurement error [12.5] < [5.4]>
- 2. Lagged dependent variables and autocorrelation [19.9.3] < [12.9.4]>
- 3. Two stage least squares [12.3] < (no reading)>
- B. Maximum likelihood estimation [Chapter 16] < [Chapter 17, 11.7, 11.8]>
- 1. Computation [E.2, E.3] < [E.4 and E.5 (both optional)]>
- 2. Covariance matrix estimation [16.4.6] < [17.4.6]>
- 3. GARCH models [19.13] < [11.8]>
- 4. Likelihood ratio, Lagrange multiplier tests [16.6, 8.5.2, 16.9.2, 9.5.3] < [17.5, 11.4.3, 11.6.3, 13.4.3]>
- C. Generalized method of moments (GMM) estimation [Chapter 15] < [chapter 18]>
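To close section A above, the sketch below contrasts least squares with instrumental variables estimation when a regressor is correlated with the disturbance. With exactly one instrument per endogenous regressor, two stage least squares reduces to b_IV = (Z'X)^(-1) Z'y. Python with numpy; the model and data are simulated for illustration only:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 500
    z = rng.normal(size=n)                       # instrument
    u = rng.normal(size=n)                       # disturbance
    x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # regressor correlated with u
    y = 1.0 + 2.0 * x + u

    X = np.column_stack([np.ones(n), x])
    Z = np.column_stack([np.ones(n), z])

    # OLS is inconsistent here; just-identified IV is consistent
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    b_iv = np.linalg.solve(Z.T @ X, Z.T @ y)
    print(b_ols)  # slope biased away from 2
    print(b_iv)   # slope close to 2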
- FINAL