Automated segmentation of white matter lesions in 3D brain MR images, using multivariate pattern classification

Zhiqiang Lao, Dinggang Shen, Abbas Jawad, Bilge Karacali, Dengfeng Liu, Elias R. Melhem, R. Nick Bryan, Christos Davatzikos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Scopus citations

Abstract

This paper presents a fully automatic white matter lesion (WML) segmentation method based on local features that combine multiple MR acquisition protocols: T1-weighted, T2-weighted, proton density (PD)-weighted, and fluid-attenuated inversion recovery (FLAIR) scans. Support vector machines (SVMs) are used to integrate features from these four acquisition protocols, thereby identifying nonlinear imaging profiles that distinguish WMLs from normal brain tissue. Validation on a population of 45 diabetes patients with diverse spatial and size distributions of WMLs demonstrates the robustness and accuracy of the proposed segmentation method, compared against manual segmentations by two experienced neuroradiologists.
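The classification scheme the abstract describes — a nonlinear SVM operating on per-voxel feature vectors drawn from the four co-registered channels — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic intensity profiles, the RBF kernel, and all parameter choices here are assumptions, and real use would involve preprocessed, co-registered, intensity-normalized scans plus the paper's additional local features.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_voxels = 200

# Hypothetical per-voxel feature vectors: intensity in each of the four
# channels (T1, T2, PD, FLAIR). "Normal tissue" voxels cluster around one
# profile; "lesion" voxels are hyperintense on T2/PD/FLAIR, as WMLs
# typically appear.
normal = rng.normal(loc=[0.5, 0.4, 0.4, 0.3], scale=0.05, size=(n_voxels, 4))
lesion = rng.normal(loc=[0.4, 0.8, 0.7, 0.9], scale=0.05, size=(n_voxels, 4))

X = np.vstack([normal, lesion])
y = np.array([0] * n_voxels + [1] * n_voxels)  # 0 = normal, 1 = lesion

# A kernel SVM learns a nonlinear boundary between the two imaging
# profiles; kernel and hyperparameters are illustrative assumptions.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Classify a new voxel whose four-channel profile resembles a lesion.
print(clf.predict([[0.4, 0.8, 0.7, 0.9]]))
```

In the paper's setting, every voxel of a scan would be pushed through such a trained classifier (with a richer local feature set), yielding a binary lesion map over the 3D volume.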

Original language: English (US)
Title of host publication: 2006 3rd IEEE International Symposium on Biomedical Imaging
Subtitle of host publication: From Nano to Macro - Proceedings
Pages: 307-310
Number of pages: 4
State: Published - 2006
Externally published: Yes
Event: 2006 3rd IEEE International Symposium on Biomedical Imaging: From Nano to Macro - Arlington, VA, United States
Duration: Apr 6 2006 – Apr 9 2006

Publication series

Name: 2006 3rd IEEE International Symposium on Biomedical Imaging: From Nano to Macro - Proceedings
Volume: 2006

Other

Other: 2006 3rd IEEE International Symposium on Biomedical Imaging: From Nano to Macro
Country/Territory: United States
City: Arlington, VA
Period: 4/6/06 – 4/9/06

ASJC Scopus subject areas

  • General Engineering
