## Abstract

The Lasso shrinkage procedure owes its popularity, in part, to its tendency to shrink estimated coefficients exactly to zero and its resulting ability to serve as a variable selection procedure. Using data-adaptive weights, the adaptive Lasso modified the original procedure to increase the penalty terms for those variables estimated to be less important by ordinary least squares. Although this modified procedure attains the oracle properties, the resulting models tend to include a large number of "false positives" in practice. Here, we adapt the concept of local false discovery rates (lFDRs) so that it applies to the sequence, λ_{n}, of smoothing parameters for the adaptive Lasso. We define the lFDR for a given λ_{n} to be the probability that the variable added to the model by decreasing λ_{n} to λ_{n} - δ is not associated with the outcome, where δ is a small value. We derive the relationship between the lFDR and λ_{n}, show lFDR = 1 for traditional smoothing parameters, and show how to select λ_{n} so as to achieve a desired lFDR. We compare the smoothing parameters chosen to achieve a specified lFDR with those chosen to achieve the oracle properties, as well as their resulting estimates of the model coefficients, using both simulation and an example from a genetic study of prostate-specific antigen.
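To make the adaptive Lasso setup concrete, here is a minimal sketch (not the paper's code) of the weighted penalty the abstract describes: each coefficient's penalty is scaled by a data-adaptive weight w_j = 1/|β̂_j^OLS|, so variables deemed less important by ordinary least squares are penalized more heavily. The implementation below uses the standard reparameterization trick, solving a plain Lasso on a rescaled design matrix; the simulated data, the fixed smoothing parameter `lam`, and all variable names are illustrative assumptions.

```python
# Sketch of the adaptive Lasso via reparameterization (illustrative,
# not from the paper): penalize sum_j w_j * |beta_j| by rescaling
# column j of X by 1/w_j, fitting an ordinary Lasso, and mapping back.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]        # only the first 3 variables matter
y = X @ beta + rng.standard_normal(n)

# Step 1: OLS gives the data-adaptive weights w_j = 1 / |beta_ols_j|,
# so small OLS estimates translate into large penalties.
ols = LinearRegression().fit(X, y)
w = 1.0 / np.abs(ols.coef_)

# Step 2: solve the weighted Lasso.  With z_j = x_j / w_j, an ordinary
# Lasso on Z estimates w_j * beta_j, so divide by w to recover beta.
lam = 0.1                           # smoothing parameter (lambda_n)
model = Lasso(alpha=lam).fit(X / w, y)
beta_hat = model.coef_ / w

print("selected variables:", np.flatnonzero(beta_hat != 0))
```

Decreasing `lam` admits additional variables one at a time; the lFDR defined in the abstract is the probability that the variable entering at such a decrease is a null (unassociated) one.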

| Original language | English (US) |
|---|---|
| Pages (from-to) | 653-666 |
| Number of pages | 14 |
| Journal | Biostatistics |
| Volume | 14 |
| Issue number | 4 |
| DOIs | |
| State | Published - Sep 2013 |
| Externally published | Yes |

## Keywords

- Adaptive Lasso
- Local false discovery rate
- Smoothing parameter
- Variable selection

## ASJC Scopus subject areas

- Statistics and Probability
- Statistics, Probability and Uncertainty