A robust sparse-modeling framework for estimating schizophrenia biomarkers from fMRI

Keith Dillon, Vince Calhoun, Yu Ping Wang

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Our goal is to identify the brain regions most relevant to mental illness using neuroimaging. State-of-the-art machine learning methods commonly suffer from repeatability problems in this application, particularly when applied to large, heterogeneous populations.

New method: We revisit both dimensionality reduction and sparse modeling and recast them in a common optimization-based framework. This allows us to combine the benefits of both types of methods in an approach we call unambiguous components. We use it to estimate the image component that, subject to a constraint on its variability, is most strongly correlated with the unknown disease mechanism.

Results: We apply the method to the estimation of neuroimaging biomarkers for schizophrenia, using task fMRI data from a large multi-site study. The proposed approach improves both the robustness of the estimate and classification accuracy.

Comparison with existing methods: We find that unambiguous components select roughly two thirds of the same brain regions as the sparsity-based methods LASSO and elastic net, while roughly one third of the selected regions differ. Furthermore, unambiguous components achieve superior accuracy in classifying cases versus controls.

Conclusions: Unambiguous components provide a robust way to estimate important regions in imaging data.
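To make the comparison concrete, below is a minimal, hypothetical sketch of the LASSO and elastic-net baselines the abstract compares against, using scikit-learn. It is not the authors' unambiguous-components method, and the feature matrix and labels are synthetic stand-ins for the fMRI features and diagnoses; it only illustrates how "selected regions" (nonzero coefficients) and their overlap, as reported in the abstract, could be computed.

```python
# Hypothetical sketch of the LASSO / elastic-net baselines named in the
# abstract -- NOT the authors' unambiguous-components method.
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n_subjects, n_regions = 200, 500
X = rng.standard_normal((n_subjects, n_regions))  # stand-in for fMRI features
y = rng.integers(0, 2, n_subjects).astype(float)  # stand-in case/control labels

# Fit both sparse models; alpha and l1_ratio are illustrative choices.
lasso = Lasso(alpha=0.05).fit(X, y)
enet = ElasticNet(alpha=0.05, l1_ratio=0.5).fit(X, y)

# A region counts as "selected" when its coefficient is nonzero.
sel_lasso = set(np.flatnonzero(lasso.coef_))
sel_enet = set(np.flatnonzero(enet.coef_))

# Jaccard overlap of the selected sets, analogous in spirit to the
# roughly-two-thirds agreement the abstract reports between methods.
union = sel_lasso | sel_enet
overlap = len(sel_lasso & sel_enet) / max(len(union), 1)
print(f"LASSO selected {len(sel_lasso)} regions, "
      f"elastic net {len(sel_enet)}, Jaccard overlap {overlap:.2f}")
```

On real data, the regularization strengths would be chosen by cross-validation, and the selected-region sets would be compared against those produced by the unambiguous-components estimate.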

Original language: English (US)
Pages (from-to): 46-55
Number of pages: 10
Journal: Journal of Neuroscience Methods
Volume: 276
DOIs
State: Published - Jan 30 2017

Keywords

  • Functional MRI
  • Optimization
  • PCA
  • Schizophrenia
  • Sparsity

ASJC Scopus subject areas

  • General Neuroscience
