Detection of relationships among multi-modal brain imaging meta-features via information flow

Robyn L. Miller, Victor M. Vergara, Vince D. Calhoun

Research output: Research › peer-review › Article

Abstract

Background: Neuroscientists and clinical researchers are awash in data from an ever-growing number of imaging and other bio-behavioral modalities. This flow of brain imaging data, taken under resting and various task conditions, combines with available cognitive measures, behavioral information, genetic data and other potentially salient biomedical and environmental information to create a rich but diffuse data landscape. The conditions studied with brain imaging data are often extremely complex, and it is common for researchers to employ more than one imaging, behavioral or biological data modality (e.g., genetics) in their investigations. While the field has advanced significantly in its approach to multimodal data, the vast majority of studies still ignore joint information among two or more features or modalities.

New method: We propose an intuitive framework based on conditional probabilities for understanding information exchange between features in what we are calling a feature meta-space; that is, a space consisting of many individual feature spaces. Features can have any dimension and can be drawn from any data source or modality. No a priori assumptions are made about the functional form (e.g., linear, polynomial, exponential) of captured inter-feature relationships.

Results: We demonstrate the framework's ability to identify relationships between disparate features of varying dimensionality by applying it to a large multi-site, multi-modal clinical dataset balanced between schizophrenia patients and controls. In our application it exposes both expected (previously observed) relationships and novel relationships rarely investigated by clinical researchers.

Comparison with existing method(s): To the best of our knowledge there is not presently a comparably efficient way to capture relationships of indeterminate functional form between features of arbitrary dimension and type. We are introducing this method as an initial foray into a space that remains relatively underpopulated.

Conclusions: The framework we propose is powerful and intuitive, and it very efficiently provides a high-level overview of a massive data space. In our application it exposes both expected relationships and relationships very rarely considered worth investigating by clinical researchers.
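The paper's actual framework is built on conditional probabilities over a feature meta-space; the details are not reproduced in this record. As a loose, minimal sketch of the underlying idea only — detecting a dependence between two features without assuming any functional form — a histogram-based mutual-information estimate can be contrasted with correlation, which misses non-linear relationships. The helper `mutual_information` below is a hypothetical illustration, not the authors' implementation.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information (in nats) between two 1-D features.

    Captures dependence of any functional form (linear or not),
    unlike a correlation coefficient. Illustrative only.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # joint distribution P(x, y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal P(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal P(y)
    nz = pxy > 0                         # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

# Quadratic relationship: correlation is near zero, mutual information is not.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5000)
y = x**2 + 0.05 * rng.standard_normal(5000)
print(abs(np.corrcoef(x, y)[0, 1]))  # small: linear measure misses it
print(mutual_information(x, y))      # clearly positive: dependence detected
```

This contrast is the motivation for model-free dependence measures in a meta-space of heterogeneous features: no choice of regression form is needed before asking whether two features share information.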

Language: English (US)
Pages: 72-80
Number of pages: 9
Journal: Journal of Neuroscience Methods
Volume: 294
DOI: 10.1016/j.jneumeth.2017.11.006
State: Published - Jan 15 2018
Externally published: Yes


Keywords

  • Big data
  • Biomedical imaging
  • Biomedical signal processing
  • Data fusion
  • Multimodal brain imaging

ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

Detection of relationships among multi-modal brain imaging meta-features via information flow. / Miller, Robyn L.; Vergara, Victor M.; Calhoun, Vince D.

In: Journal of Neuroscience Methods, Vol. 294, 15.01.2018, p. 72-80.


@article{066c72daae0648368fe0e580fc23c121,
title = "Detection of relationships among multi-modal brain imaging meta-features via information flow",
keywords = "Big data, Biomedical imaging, Biomedical signal processing, Data fusion, Multimodal brain imaging",
author = "Miller, {Robyn L.} and Vergara, {Victor M.} and Calhoun, {Vince D.}",
year = "2018",
month = "1",
doi = "10.1016/j.jneumeth.2017.11.006",
volume = "294",
pages = "72--80",
journal = "Journal of Neuroscience Methods",
issn = "0165-0270",
publisher = "Elsevier",

}

TY - JOUR

T1 - Detection of relationships among multi-modal brain imaging meta-features via information flow

AU - Miller, Robyn L.

AU - Vergara, Victor M.

AU - Calhoun, Vince D.

PY - 2018/1/15

Y1 - 2018/1/15


KW - Big data

KW - Biomedical imaging

KW - Biomedical signal processing

KW - Data fusion

KW - Multimodal brain imaging

UR - http://www.scopus.com/inward/record.url?scp=85034432894&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85034432894&partnerID=8YFLogxK

U2 - 10.1016/j.jneumeth.2017.11.006

DO - 10.1016/j.jneumeth.2017.11.006

M3 - Article

VL - 294

SP - 72

EP - 80

JO - Journal of Neuroscience Methods

T2 - Journal of Neuroscience Methods

JF - Journal of Neuroscience Methods

SN - 0165-0270

ER -