Automatic joint classification and segmentation of whole cell 3D images

Rajesh Narasimha, Hua Ouyang, Alexander Gray, Steven W. McLaughlin, Sriram Subramaniam

Research output: Contribution to journal › Article

Abstract

We present a machine learning tool for automatic texton-based joint classification and segmentation of mitochondria in MNT-1 cells imaged using ion-abrasion scanning electron microscopy (IA-SEM). For diagnosing signatures that may be unique to cellular states such as cancer, automatic tools with minimal user intervention need to be developed for the analysis and mining of high-throughput data from these large-volume data sets (typically ∼2 GB/cell). Challenges for such a tool in 3D electron microscopy arise from the low contrast and signal-to-noise ratio (SNR) inherent to biological imaging. Our approach is based on block-wise classification of images into a trained list of regions. Given manually labeled images, our goal is to learn models that can localize novel instances of these regions in test datasets. Since datasets obtained using electron microscopes are intrinsically noisy, we improve the SNR of the data prior to automatic segmentation by applying a 2D texture-preserving filter to each slice of the 3D dataset. We investigate texton-based region features in this work. Classification is performed using a k-nearest neighbor (k-NN) classifier, support vector machines (SVMs), adaptive boosting (AdaBoost), and histogram matching with a nearest-neighbor classifier. In addition, we study the computational complexity vs. segmentation accuracy tradeoff of these classifiers. Segmentation results demonstrate that our approach, using minimal training data, performs close to semi-automatic methods based on the variational level-set method and to manual segmentation carried out by an experienced user. Using our method, which we show to have minimal user intervention and high classification accuracy, we investigate quantitative parameters such as the volume of the cytoplasm occupied by mitochondria, differences between the surface areas of the inner and outer membranes, and mean mitochondrial width, quantities potentially relevant to distinguishing cancer cells from normal cells. To test the accuracy of our approach, these quantities are compared against manually computed counterparts. We also demonstrate the extension of these methods to segmenting 3D images obtained using electron tomography.
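The denoising step described in the abstract can be illustrated with a minimal sketch. The paper's actual texture-preserving filter is not specified here, so a simple 3×3 median filter, applied slice by slice, stands in for it; the window size and padding mode are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def median_filter_2d(img, size=3):
    """Edge/texture-preserving median filter with edge replication.

    Stands in for the 2D texture-preserving filter applied to each
    slice of the 3D dataset; the 3x3 window is an assumed choice.
    """
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    # Gather all size*size shifted views and take the per-pixel median.
    windows = np.stack([
        padded[i:i + img.shape[0], j:j + img.shape[1]]
        for i in range(size) for j in range(size)
    ])
    return np.median(windows, axis=0)

# Toy slice: a constant region corrupted by a single impulse ("shot") noise pixel.
slice_ = np.full((16, 16), 0.5)
slice_[4, 4] = 1.0                 # noisy pixel
denoised = median_filter_2d(slice_)
print(denoised[4, 4])              # impulse removed -> 0.5
```

A median filter removes impulse noise while leaving step edges intact, which is the basic property a texture-preserving denoiser needs before block-wise feature extraction.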
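The classification stage can likewise be sketched. The block-wise texton-histogram pipeline below is illustrative only: the toy texton centers, the chi-squared-style distance, the block size, and the class labels are all assumptions, and the real method learns textons from a filter-bank clustering rather than the 1D intensity quantization used here.

```python
import numpy as np

def texton_histogram(block, texton_centers):
    """Assign each pixel to its nearest texton and return the block's
    normalized texton-label histogram (the region feature)."""
    responses = block.reshape(-1, 1)     # toy 1-D "filter response" per pixel
    dists = np.abs(responses - texton_centers.reshape(1, -1))
    labels = dists.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(texton_centers)).astype(float)
    return hist / hist.sum()

def knn_classify(feature, train_feats, train_labels, k=3):
    """k-NN vote using a chi-squared-style histogram distance."""
    d = ((train_feats - feature) ** 2 / (train_feats + feature + 1e-12)).sum(axis=1)
    nearest = np.argsort(d)[:k]
    return np.bincount(train_labels[nearest]).argmax()

# Toy training data: "mitochondrion" blocks are darker than "background" blocks.
rng = np.random.default_rng(0)
textons = np.array([0.2, 0.5, 0.8])      # assumed texton centers
dark = [rng.normal(0.25, 0.05, (8, 8)) for _ in range(5)]
light = [rng.normal(0.75, 0.05, (8, 8)) for _ in range(5)]
X = np.array([texton_histogram(b, textons) for b in dark + light])
y = np.array([1] * 5 + [0] * 5)          # 1 = mitochondrion, 0 = background

test_block = rng.normal(0.25, 0.05, (8, 8))
pred = knn_classify(texton_histogram(test_block, textons), X, y, k=3)
print(pred)  # a dark test block is voted into class 1
```

Sweeping k, or swapping the k-NN rule for an SVM or AdaBoost on the same histogram features, is how the complexity-vs-accuracy tradeoff the abstract mentions would be explored.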

Original language: English (US)
Pages (from-to): 1067-1079
Number of pages: 13
Journal: Pattern Recognition
Volume: 42
Issue number: 6
DOI: 10.1016/j.patcog.2008.08.009
State: Published - Jun 2009
Externally published: Yes

Keywords

  • Automated techniques
  • Cancer detection
  • Classification
  • Machine learning
  • Mitochondria
  • Segmentation
  • Texture features

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing


Cite this

Narasimha, R., Ouyang, H., Gray, A., McLaughlin, S. W., & Subramaniam, S. (2009). Automatic joint classification and segmentation of whole cell 3D images. Pattern Recognition, 42(6), 1067-1079. https://doi.org/10.1016/j.patcog.2008.08.009