Semiparametric partial common principal component analysis for covariance matrices

Bingkai Wang, Xi Luo, Yi Zhao, Brian Caffo

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of jointly modeling multiple covariance matrices by partial common principal component analysis (PCPCA), which assumes that a proportion of the eigenvectors are shared across covariance matrices and the rest are individual-specific. This paper proposes consistent estimators of the shared eigenvectors in PCPCA as the number of matrices or the number of samples used to estimate each matrix goes to infinity. We prove these asymptotic results without making any assumptions on the ranks of the eigenvalues associated with the shared eigenvectors. When the number of samples goes to infinity, our results do not require the data to be Gaussian distributed. Furthermore, this paper introduces a sequential testing procedure to identify the number of shared eigenvectors in PCPCA. In simulation studies, our method shows higher accuracy in estimating the shared eigenvectors than competing methods. Applied to a motor-task functional magnetic resonance imaging data set, our estimator identifies meaningful brain networks that are consistent with the current scientific understanding of motor networks during a motor paradigm.
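The PCPCA model described in the abstract posits that each covariance matrix shares a subset of its eigenvectors with the others, while the remaining eigenvectors are specific to each matrix. The Python sketch below is only an illustration of this covariance structure, not the paper's semiparametric estimator or sequential testing procedure; all dimensions, variable names, and values are assumptions chosen for the example. It constructs several covariance matrices with a common set of leading eigenvectors and checks that those shared eigenvectors diagonalize every matrix on the shared block.

```python
# Illustrative sketch of the PCPCA covariance structure (not the paper's estimator).
import numpy as np

rng = np.random.default_rng(0)
p, s, K = 6, 2, 4  # dimension, number of shared eigenvectors, number of matrices

# Shared eigenvectors: an orthonormal p x s block common to every covariance matrix.
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))
V_shared = Q[:, :s]
complement = Q[:, s:]  # orthogonal complement, p x (p - s)

Sigmas = []
for k in range(K):
    # Eigenvalues for the shared directions may differ across matrices.
    lam_shared = rng.uniform(2.0, 5.0, size=s)
    # Individual-specific eigenvectors live inside the orthogonal complement.
    R, _ = np.linalg.qr(rng.standard_normal((p - s, p - s)))
    W_k = complement @ R
    lam_k = rng.uniform(0.5, 1.5, size=p - s)
    Sigma_k = (V_shared @ np.diag(lam_shared) @ V_shared.T
               + W_k @ np.diag(lam_k) @ W_k.T)
    Sigmas.append(Sigma_k)

# The shared eigenvectors diagonalize every Sigma_k on the shared block:
for Sigma_k in Sigmas:
    block = V_shared.T @ Sigma_k @ V_shared
    off_diag = block - np.diag(np.diag(block))
    assert np.allclose(off_diag, 0.0, atol=1e-10)
```

In this construction, each matrix's individual-specific eigenvectors are drawn from the orthogonal complement of the shared directions, so the shared eigenvectors remain exact eigenvectors of every covariance matrix even though the other eigenvectors and all eigenvalues vary from matrix to matrix.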

Original language: English (US)
Pages (from-to): 1175-1186
Number of pages: 12
Journal: Biometrics
Volume: 77
Issue number: 4
State: Published - Dec 2021

ASJC Scopus subject areas

  • General Agricultural and Biological Sciences
  • Applied Mathematics
  • General Biochemistry, Genetics and Molecular Biology
  • General Immunology and Microbiology
  • Statistics and Probability
