Sparseness and a reduction from Totally Nonnegative Least Squares to SVM

Vamsi K. Potluru, Sergey M. Plis, Shuang Luan, Vince Daniel Calhoun, Thomas P. Hayes

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Nonnegative Least Squares (NNLS) is a general formulation underlying many important problems. We consider the special case of NNLS in which the input itself is nonnegative, known in the literature as Totally Nonnegative Least Squares (TNNLS). We show a reduction of TNNLS to a single-class Support Vector Machine (SVM), thus relating the sparsity of a TNNLS solution to the sparsity of support vectors in an SVM. This allows any SVM solver to be applied to the TNNLS problem. We obtain an order-of-magnitude improvement in running time in two steps: first, a fast approximate SVM solver produces a smaller version of the original problem with the same solution; second, an exact NNLS solver computes the solution of this reduced problem. We present experimental evidence that this approach improves the performance of state-of-the-art NNLS solvers, on both randomly generated problems and real datasets arising from the calculation of radiation therapy dosages for cancer patients.
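
To make the reduction concrete, expand the least-squares objective: for a design matrix A ≥ 0 and a target b ≥ 0, TNNLS is the quadratic program

    \min_{x \ge 0} \tfrac{1}{2}\|Ax - b\|_2^2
      \;=\; \min_{x \ge 0} \tfrac{1}{2}\, x^\top (A^\top A)\, x \;-\; (A^\top b)^\top x \;+\; \tfrac{1}{2}\|b\|_2^2 .

Up to the constant term this is a quadratic program over the nonnegative orthant with Gram matrix Q = A^T A, the same family of programs that SVM duals belong to, with the columns of A playing the role of training points under a linear kernel. The precise single-class construction is the paper's; the expansion above is only the elementary step behind it. Coordinates with x_i = 0 correspond to columns that are not support vectors, which is the sparsity link the abstract draws.

The two-stage solver can likewise be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the screening heuristic below is hypothetical and merely stands in for the fast approximate SVM solver, so it does not carry the paper's guarantee that the reduced problem has the same solution; scipy.optimize.nnls serves as the exact solver.

    # Two-stage TNNLS sketch (hypothetical screening step; see caveat above).
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    m, n = 200, 2000
    A = rng.random((m, n))   # totally nonnegative input: A >= 0 entrywise
    b = rng.random(m)        # b >= 0 as well

    # Stage 1 (hypothetical): keep the k columns most correlated with b,
    # standing in for the approximate SVM solver's support identification.
    k = 300
    scores = (A.T @ b) / np.linalg.norm(A, axis=0)
    keep = np.argsort(scores)[-k:]

    # Stage 2: exact NNLS on the reduced problem, embedded back into R^n.
    x_small, residual = nnls(A[:, keep], b)
    x = np.zeros(n)
    x[keep] = x_small

    # NNLS solutions are sparse: an optimal active set has at most m entries.
    print("nonzeros:", np.count_nonzero(x), "residual:", residual)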

Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Pages: 1922-1929
Number of pages: 8
DOIs: https://doi.org/10.1109/IJCNN.2011.6033459
State: Published - 2011
Externally published: Yes
Event: 2011 International Joint Conference on Neural Networks, IJCNN 2011 - San Jose, CA, United States
Duration: Jul 31, 2011 - Aug 5, 2011

Other

Other: 2011 International Joint Conference on Neural Networks, IJCNN 2011
Country: United States
City: San Jose, CA
Period: 7/31/11 - 8/5/11

Fingerprint

  • Support vector machines
  • Radiotherapy

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

APA

Potluru, V. K., Plis, S. M., Luan, S., Calhoun, V. D., & Hayes, T. P. (2011). Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. In Proceedings of the International Joint Conference on Neural Networks (pp. 1922-1929). [6033459] https://doi.org/10.1109/IJCNN.2011.6033459

Standard

Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. / Potluru, Vamsi K.; Plis, Sergey M.; Luan, Shuang; Calhoun, Vince Daniel; Hayes, Thomas P.

Proceedings of the International Joint Conference on Neural Networks. 2011. p. 1922-1929. [6033459]

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Harvard

Potluru, VK, Plis, SM, Luan, S, Calhoun, VD & Hayes, TP 2011, Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. in Proceedings of the International Joint Conference on Neural Networks, 6033459, pp. 1922-1929, 2011 International Joint Conference on Neural Networks, IJCNN 2011, San Jose, CA, United States, 7/31/11. https://doi.org/10.1109/IJCNN.2011.6033459

Vancouver

Potluru VK, Plis SM, Luan S, Calhoun VD, Hayes TP. Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. In Proceedings of the International Joint Conference on Neural Networks. 2011. p. 1922-1929. 6033459. https://doi.org/10.1109/IJCNN.2011.6033459

Author

Potluru, Vamsi K. ; Plis, Sergey M. ; Luan, Shuang ; Calhoun, Vince Daniel ; Hayes, Thomas P. / Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. Proceedings of the International Joint Conference on Neural Networks. 2011. pp. 1922-1929

BIBTEX

@inproceedings{765685035470438cb406d9dc387969dc,
title = "Sparseness and a reduction from Totally Nonnegative Least Squares to SVM",
abstract = "Nonnegative Least Squares (NNLS) is a general form for many important problems. We consider a special case of NNLS where the input is nonnegative. It is called Totally Nonnegative Least Squares (TNNLS) in the literature. We show a reduction of TNNLS to a single class Support Vector Machine (SVM), thus relating the sparsity of a TNNLS solution to the sparsity of supports in a SVM. This allows us to apply any SVM solver to the TNNLS problem. We get an order of magnitude improvement in running time by first obtaining a smaller version of our original problem with the same solution using a fast approximate SVM solver. Second, we use an exact NNLS solver to obtain the solution. We present experimental evidence that this approach improves the performance of state-of-the-art NNLS solvers by applying it to both randomly generated problems as well as to real datasets, calculating radiation therapy dosages for cancer patients.",
author = "Potluru, {Vamsi K.} and Plis, {Sergey M.} and Shuang Luan and Calhoun, {Vince Daniel} and Hayes, {Thomas P.}",
year = "2011",
doi = "10.1109/IJCNN.2011.6033459",
language = "English (US)",
isbn = "9781457710865",
pages = "1922--1929",
booktitle = "Proceedings of the International Joint Conference on Neural Networks",

}

RIS

TY - GEN

T1 - Sparseness and a reduction from Totally Nonnegative Least Squares to SVM

AU - Potluru, Vamsi K.

AU - Plis, Sergey M.

AU - Luan, Shuang

AU - Calhoun, Vince Daniel

AU - Hayes, Thomas P.

PY - 2011

Y1 - 2011

N2 - Nonnegative Least Squares (NNLS) is a general formulation underlying many important problems. We consider the special case of NNLS in which the input itself is nonnegative, known in the literature as Totally Nonnegative Least Squares (TNNLS). We show a reduction of TNNLS to a single-class Support Vector Machine (SVM), thus relating the sparsity of a TNNLS solution to the sparsity of support vectors in an SVM. This allows any SVM solver to be applied to the TNNLS problem. We obtain an order-of-magnitude improvement in running time in two steps: first, a fast approximate SVM solver produces a smaller version of the original problem with the same solution; second, an exact NNLS solver computes the solution of this reduced problem. We present experimental evidence that this approach improves the performance of state-of-the-art NNLS solvers, on both randomly generated problems and real datasets arising from the calculation of radiation therapy dosages for cancer patients.

AB - Nonnegative Least Squares (NNLS) is a general formulation underlying many important problems. We consider the special case of NNLS in which the input itself is nonnegative, known in the literature as Totally Nonnegative Least Squares (TNNLS). We show a reduction of TNNLS to a single-class Support Vector Machine (SVM), thus relating the sparsity of a TNNLS solution to the sparsity of support vectors in an SVM. This allows any SVM solver to be applied to the TNNLS problem. We obtain an order-of-magnitude improvement in running time in two steps: first, a fast approximate SVM solver produces a smaller version of the original problem with the same solution; second, an exact NNLS solver computes the solution of this reduced problem. We present experimental evidence that this approach improves the performance of state-of-the-art NNLS solvers, on both randomly generated problems and real datasets arising from the calculation of radiation therapy dosages for cancer patients.

UR - http://www.scopus.com/inward/record.url?scp=80054750138&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=80054750138&partnerID=8YFLogxK

U2 - 10.1109/IJCNN.2011.6033459

DO - 10.1109/IJCNN.2011.6033459

M3 - Conference contribution

AN - SCOPUS:80054750138

SN - 9781457710865

SP - 1922

EP - 1929

BT - Proceedings of the International Joint Conference on Neural Networks

ER -