### Abstract

Nonnegative Least Squares (NNLS) is a general form for many important problems. We consider a special case of NNLS where the input is nonnegative, known in the literature as Totally Nonnegative Least Squares (TNNLS). We show a reduction of TNNLS to a single-class Support Vector Machine (SVM), relating the sparsity of a TNNLS solution to the sparsity of support vectors in an SVM. This allows any SVM solver to be applied to the TNNLS problem. We obtain an order-of-magnitude improvement in running time by first using a fast approximate SVM solver to shrink the original problem to a smaller one with the same solution, and then applying an exact NNLS solver to the reduced problem. We present experimental evidence that this approach improves the performance of state-of-the-art NNLS solvers on both randomly generated problems and real datasets arising in the calculation of radiation therapy dosages for cancer patients.
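As a concrete illustration of the TNNLS setting described above (the problem itself, not the paper's SVM reduction), the following minimal sketch solves a randomly generated instance with SciPy's exact NNLS solver; the matrix sizes and random seed are arbitrary choices for this example:

```python
import numpy as np
from scipy.optimize import nnls

# TNNLS instance: minimize ||Ax - b||_2 subject to x >= 0,
# where A and b themselves contain only nonnegative entries.
rng = np.random.default_rng(0)
A = rng.random((50, 20))   # nonnegative design matrix
b = rng.random(50)         # nonnegative target vector

# Exact NNLS solve (Lawson-Hanson active-set method).
x, resid = nnls(A, b)

# Solutions of such instances are typically sparse; the paper relates
# this sparsity to the support-vector set of a single-class SVM.
print(np.count_nonzero(x), "nonzero coefficients out of", x.size)
```

Note that only the nonzero coefficients of `x` matter, which is what makes a reduction that first identifies a small "support" subproblem attractive.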

| Original language | English (US) |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Pages | 1922-1929 |
| Number of pages | 8 |
| DOIs | https://doi.org/10.1109/IJCNN.2011.6033459 |
| State | Published - 2011 |
| Externally published | Yes |
| Event | 2011 International Joint Conference on Neural Networks, IJCNN 2011 - San Jose, CA, United States. Duration: Jul 31 2011 → Aug 5 2011 |

### Other

| Other | 2011 International Joint Conference on Neural Networks, IJCNN 2011 |
|---|---|
| Country | United States |
| City | San Jose, CA |
| Period | 7/31/11 → 8/5/11 |

### ASJC Scopus subject areas

- Software
- Artificial Intelligence

### Cite this

Potluru, V. K., Plis, S. M., Luan, S., Calhoun, V. D., & Hayes, T. P. (2011). Sparseness and a reduction from Totally Nonnegative Least Squares to SVM. In *Proceedings of the International Joint Conference on Neural Networks* (pp. 1922-1929). [6033459] https://doi.org/10.1109/IJCNN.2011.6033459

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN
T1 - Sparseness and a reduction from Totally Nonnegative Least Squares to SVM
AU - Potluru, Vamsi K.
AU - Plis, Sergey M.
AU - Luan, Shuang
AU - Calhoun, Vince Daniel
AU - Hayes, Thomas P.
PY - 2011
Y1 - 2011
UR - http://www.scopus.com/inward/record.url?scp=80054750138&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=80054750138&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2011.6033459
DO - 10.1109/IJCNN.2011.6033459
M3 - Conference contribution
AN - SCOPUS:80054750138
SN - 9781457710865
SP - 1922
EP - 1929
BT - Proceedings of the International Joint Conference on Neural Networks
ER -