Bio-Swarm-Pipeline: A light-weight, extensible batch processing system for efficient biomedical data processing

Xi Cheng, Ricardo Pizarro, Yunxia Tong, Brad Zoltick, Qian Luo, Daniel R. Weinberger, Venkata S. Mattay

Research output: Contribution to journal › Article

Abstract

A streamlined scientific workflow system that can track the details of the data processing history is critical for the efficient handling of fundamental routines used in scientific research. In the scientific workflow research community, the information describing the details of data processing history is referred to as "provenance," and it plays an important role in most existing workflow management systems. Despite its importance, however, provenance modeling and management remain a relatively new area in the scientific workflow research community. The proper scope, representation, granularity, and implementation of a provenance model can vary from domain to domain and pose a number of challenges for efficient pipeline design. This paper provides a case study on structured provenance modeling and management in the neuroimaging domain by introducing the Bio-Swarm-Pipeline. This new model, which is evaluated in the paper through real-world scenarios, systematically addresses the provenance scope, representation, granularity, and implementation issues relevant to the neuroimaging domain. Although the model stems from applications in neuroimaging, the system can potentially be adapted to a wide range of biomedical application scenarios.
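To make the notion of provenance concrete, the sketch below shows one way a pipeline step's processing history might be recorded and later traced. This is an illustrative example only, not the Bio-Swarm-Pipeline's actual data model; all names (`ProvenanceRecord`, `lineage`, the tool and file names) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    """One step in a pipeline's data processing history (illustrative)."""
    tool: str                                   # program that produced the outputs
    version: str                                # tool version, for reproducibility
    inputs: list                                # input file identifiers
    outputs: list                               # output file identifiers
    parameters: dict = field(default_factory=dict)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def lineage(records, target):
    """Trace which input files directly produced `target`."""
    sources = []
    for r in records:
        if target in r.outputs:
            sources.extend(r.inputs)
    return sources

# Record a hypothetical motion-correction step on a neuroimaging scan.
step = ProvenanceRecord(
    tool="motion_correct",
    version="1.0",
    inputs=["sub01_raw.nii"],
    outputs=["sub01_mc.nii"],
    parameters={"reference_volume": 0},
)

print(lineage([step], "sub01_mc.nii"))  # → ['sub01_raw.nii']
```

Granularity, one of the design questions the abstract raises, corresponds here to deciding how fine-grained each record is: per pipeline run, per tool invocation, or per individual file transformation.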

Original language: English (US)
Article number: 35
Journal: Frontiers in Neuroinformatics
Volume: 3
Issue number: OCT
DOIs: Yes
State: Published - Oct 9 2009
Externally published: Yes

Keywords

  • Neuroimaging
  • Neuroinformatics
  • Provenance
  • Scientific workflow
  • Swarm

ASJC Scopus subject areas

  • Neuroscience (miscellaneous)
  • Biomedical Engineering
  • Computer Science Applications
