## Abstract

Designing deep neural networks (DNNs) for the representation and analysis of high-dimensional, small-sample-size data remains a major challenge. One solution is to construct a sparse network. Many existing approaches achieve sparsity in DNNs through regularization, but most apply it only during pre-training because explicit formulae are difficult to derive for the fine-tuning process. In this paper, a log-sum function is used as the regularization term for both the responses of hidden neurons and the network connections in the loss function of the fine-tuning process. It approximates the L_{0}-norm more closely than several commonly used norms. Based on the gradient formula of the loss function, the fine-tuning process can be executed more efficiently; in particular, the gradient computation commonly performed in deep learning frameworks such as PyTorch and TensorFlow can be accelerated. Given an analytic formula for the gradients in every layer of the DNN, the error accumulated by successive numerical approximations during differentiation can be avoided. With the proposed log-sum enhanced sparse deep neural network (LSES-DNN), the sparsity of both the responses and the connections can be well controlled to improve the adaptivity of DNNs. The proposed model is applied to MRI data for both the diagnosis of schizophrenia and the study of brain development. Numerical experiments demonstrate its superior performance over several classical classifiers tested.
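To illustrate why a log-sum penalty approximates the L_{0}-norm better than the L_{1}-norm, the sketch below uses the standard log-sum form sum_i log(1 + |v_i|/eps) together with its analytic gradient sign(v_i)/(eps + |v_i|). This is the common parameterization of the log-sum penalty, not necessarily the exact regularizer or constants used in the paper; the vectors `sparse` and `dense` are illustrative examples.

```python
import math

def log_sum_penalty(values, eps=1e-2):
    """Log-sum penalty: sum_i log(1 + |v_i| / eps).

    As eps -> 0 this behaves like a scaled L0 count: near-zero entries
    contribute almost nothing, while nonzero entries contribute roughly
    log(|v_i| / eps), growing far more slowly than the L1 norm does.
    """
    return sum(math.log(1.0 + abs(v) / eps) for v in values)

def log_sum_grad(values, eps=1e-2):
    """Analytic (sub)gradient of the log-sum penalty: sign(v_i) / (eps + |v_i|).

    Having this closed form is what lets the penalty's gradient be
    computed exactly during back-propagation instead of numerically.
    """
    return [math.copysign(1.0, v) / (eps + abs(v)) if v != 0.0 else 0.0
            for v in values]

def l1_norm(values):
    return sum(abs(v) for v in values)

# A sparse and a dense vector with identical L1 norm (both equal 3.0):
sparse = [3.0, 0.0, 0.0, 0.0]
dense = [0.75, 0.75, 0.75, 0.75]

# The L1 norm cannot distinguish them, but the log-sum penalty is
# markedly smaller for the sparse vector, so minimizing it drives
# small entries toward exact zero.
print(l1_norm(sparse), l1_norm(dense))
print(log_sum_penalty(sparse), log_sum_penalty(dense))
```

Note how the gradient magnitude 1/(eps + |v_i|) is large for small entries and small for large ones, so the penalty shrinks near-zero responses and connections aggressively while leaving significant ones comparatively untouched.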

| Original language | English (US) |
|---|---|
| Pages (from-to) | 206-220 |
| Number of pages | 15 |
| Journal | Neurocomputing |
| Volume | 407 |
| State | Published - Sep 24 2020 |
| Externally published | Yes |

## Keywords

- Back propagation algorithm
- Concise gradient formula
- Deep neural network
- Log-sum enhanced sparsity
- Magnetic resonance imaging

## ASJC Scopus subject areas

- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence