Deep learning: RNNs and LSTM

Robert DiPietro, Gregory D. Hager

Research output: Chapter in Book/Report/Conference proceeding › Chapter

11 Scopus citations

Abstract

Recurrent neural networks (RNNs) are a class of neural networks that are naturally suited to processing time-series and other sequential data. Here we introduce RNNs as an extension of feedforward networks that allows the processing of variable-length (or even infinite-length) sequences, and we present some of the most popular recurrent architectures in use, including long short-term memory (LSTM) and gated recurrent units (GRUs). In addition, we discuss several related topics in detail, including probabilistic models that are often realized using RNNs and applications of RNNs that have appeared within the MICCAI community.
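As a rough illustration of the recurrence the abstract refers to, the following is a minimal sketch (not the chapter's code; all names, shapes, and initializations are assumptions) of a vanilla RNN cell applied step by step, which is what lets the same parameters process sequences of any length:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Hypothetical dimensions and randomly initialized weights for illustration.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 3, 4, 5
W_xh = 0.1 * rng.standard_normal((hidden_dim, input_dim))
W_hh = 0.1 * rng.standard_normal((hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

# The same cell is applied at every time step, so the sequence
# length is not fixed by the architecture.
h = np.zeros(hidden_dim)  # initial hidden state
for x_t in rng.standard_normal((seq_len, input_dim)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # final hidden state summarizes the whole sequence
```

LSTM and GRU cells replace the single `tanh` update with gated updates that make long-range credit assignment easier to learn, but the outer structure, one shared cell unrolled over time, is the same.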

Original language: English (US)
Title of host publication: Handbook of Medical Image Computing and Computer Assisted Intervention
Publisher: Elsevier
Pages: 503-519
Number of pages: 17
ISBN (Electronic): 9780128161760
DOIs
State: Published - Jan 1 2019

Keywords

  • Long short-term memory
  • Recurrent neural networks

ASJC Scopus subject areas

  • General Computer Science
