Whole MILC: Generalizing learned dynamics across tasks, datasets, and populations

Usman Mahmood, Md Mahfuzur Rahman, Alex Fedorov, Noah Lewis, Zening Fu, Vince D. Calhoun, Sergey M. Plis

Research output: Contribution to journal › Article › peer-review

Abstract

Behavioral changes are among the earliest signs of a mental disorder, but arguably the dynamics of brain function are affected even earlier. Consequently, the spatio-temporal structure of disorder-specific dynamics is crucial for early diagnosis and for understanding the mechanism of the disorder. A common way of learning discriminatory features relies on training a classifier and evaluating feature importance. Classical classifiers based on handcrafted features are quite powerful, but suffer from the curse of dimensionality when applied to the large input dimensions of spatio-temporal data. Deep learning algorithms can cope with high-dimensional input, and model introspection can highlight discriminatory spatio-temporal regions, but they require far more samples to train. In this paper we present a novel self-supervised training schema that reinforces whole-sequence mutual information local to context (whole MILC). We pre-train the whole MILC model on unlabeled and unrelated healthy control data. We test our model on three different disorders, (i) schizophrenia, (ii) autism, and (iii) Alzheimer's disease, across four different studies. Our algorithm outperforms existing self-supervised pre-training methods and provides classification results competitive with classical machine learning algorithms. Importantly, whole MILC enables attribution of a subject's diagnosis to specific spatio-temporal regions in the fMRI signal.
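The core idea of "whole-sequence mutual information local to context" can be illustrated with a contrastive (InfoNCE-style) objective: each local window embedding of a sequence is scored against whole-sequence embeddings, with the other sequences in the batch serving as negatives. The following is a minimal sketch of such a loss, not the authors' implementation; the function name, array shapes, and the use of a plain dot-product score are assumptions for illustration.

```python
import numpy as np

def infonce_whole_milc(local_embs, global_embs):
    """InfoNCE-style loss pairing each local (window) embedding with its own
    sequence's whole-sequence (global) embedding.

    local_embs:  (batch, windows, dim) window-level embeddings
    global_embs: (batch, dim) whole-sequence embeddings
    Other sequences in the batch act as negatives.
    """
    b, w, d = local_embs.shape
    losses = []
    for t in range(w):
        # similarity of every window-t embedding to every global embedding
        scores = local_embs[:, t, :] @ global_embs.T      # (batch, batch)
        # numerically stable log-softmax over each row
        scores = scores - scores.max(axis=1, keepdims=True)
        log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
        # positives sit on the diagonal (window i with sequence i)
        losses.append(-np.mean(np.diag(log_probs)))
    return float(np.mean(losses))
```

A matched local/global pair yields a low loss, while mismatched embeddings approach the chance level of log(batch size); maximizing this agreement is what drives the mutual-information estimate during pre-training.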

Original language: English (US)
Journal: Unknown Journal
State: Published - Jul 29 2020

Keywords

  • Deep Learning
  • Resting State fMRI
  • Self-Supervised
  • Transfer Learning

ASJC Scopus subject areas

  • General
