Profile-kernel likelihood inference with diverging number of parameters

Clifford Lam, Jianqing Fan

Research output: Contribution to journal › Article › peer-review

88 Scopus citations

Abstract

The generalized varying coefficient partially linear model with a growing number of predictors arises in many contemporary scientific endeavors. In this paper we address both the theoretical and practical sides of profile likelihood estimation and inference. When the number of parameters grows with the sample size, the existence and asymptotic normality of the profile likelihood estimator are established under some regularity conditions. Profile likelihood ratio inference for the growing number of parameters is proposed and the Wilks phenomenon is demonstrated. A new algorithm, called the accelerated profile-kernel algorithm, for computing the profile-kernel estimator is proposed and investigated. Simulation studies show that the resulting estimates are as efficient as the fully iterative profile-kernel estimates. For moderate sample sizes, our proposed procedure saves much computational time over the fully iterative profile-kernel one and gives more stable estimates. A set of real data is analyzed using our proposed algorithm.
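As an illustration of the model class named in the abstract (the notation below is a standard textbook form, not copied from the paper's own displays), a generalized varying coefficient partially linear model links the conditional mean of a response $Y$ to covariates $(X, Z)$ and an index variable $U$ via a known link $g$:

```latex
% Generalized varying coefficient partially linear model (illustrative notation):
%   alpha(.) : unknown smooth coefficient functions of the index U
%   beta     : parametric coefficients, with dimension p_n allowed to grow with n
g\bigl(\mathbb{E}[Y \mid U, X, Z]\bigr)
  = X^{\top}\alpha(U) + Z^{\top}\beta .
```

In broad terms, profile likelihood estimation of this type fixes $\beta$, estimates $\alpha(\cdot)$ by local kernel-weighted likelihood, and then maximizes the resulting profiled likelihood over $\beta$; the abstract's diverging-dimension results concern the regime where $p_n = \dim(\beta)$ grows with the sample size $n$.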

Original language: English (US)
Pages (from-to): 2232-2260
Number of pages: 29
Journal: Annals of Statistics
Volume: 36
Issue number: 5
DOIs
State: Published - Oct 2008

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Asymptotic normality
  • Generalized likelihood ratio tests
  • Generalized linear models
  • High dimensionality
  • Profile likelihood
  • Varying coefficients
