### Abstract

This article explores numerical methods for stochastic optimization, with special attention to Bayesian design problems. A common and challenging situation occurs when the objective function (in Bayesian applications, the expected utility) is very expensive to evaluate, perhaps because it requires integration over a space of very high dimensionality. Our goal is to explore a class of optimization algorithms designed to gain efficiency in such situations by exploiting smoothness of the expected utility surface and borrowing information from neighboring design points. The central idea is to implement stochastic optimization by curve fitting of Monte Carlo samples. This is done by simulating draws from the joint parameter/sample space and evaluating the observed utilities. Fitting a smooth surface through these simulated points serves as an estimate of the expected utility surface. The optimal design can then be found deterministically. In this article we introduce a general algorithm for curve-fitting-based optimization, discuss implementation options, and present a consistency property for one particular implementation of the algorithm. To illustrate the advantages and limitations of curve-fitting-based optimization, and to compare it with some of the alternatives, we consider in detail two important practical applications: an information-theoretic stopping rule for a clinical trial, with an objective function based on the expected amount of information acquired about a subvector of parameters of interest, and the design of exploratory shock levels in the implantation of heart defibrillators. The latter example is also used for comparison with some of the alternative schemes. One of the main attractions of efficient optimization algorithms in design is the application to sequential problems. We conclude with an outlook on how the ideas presented here can be extended to solve stochastic dynamic programming problems such as those occurring in Bayesian sequential design.
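The procedure sketched in the abstract — simulate parameter/data pairs at scattered design points, record the observed utility at each, fit a smooth curve through the noisy (design, utility) cloud, and then maximize the fitted surface deterministically — can be illustrated with a small sketch. Everything here is a made-up toy example, not the authors' implementation: the one-dimensional design space, the utility function `sample_utility`, and the choice of a polynomial smoother are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy design problem: choose a design d in [0, 5]. The true expected
# utility (unknown to the algorithm) trades estimation accuracy against
# a linear cost of the design, peaking at an interior point.
def sample_utility(d):
    theta = rng.normal(0.0, 1.0)           # draw the parameter from its prior
    y = rng.normal(theta, 1.0 / (1 + d))   # draw data given theta and design d
    # observed utility: negative squared error minus a cost of the design
    return -(y - theta) ** 2 - 0.3 * d

# Step 1: scatter Monte Carlo evaluations over the design space.
designs = rng.uniform(0.0, 5.0, size=2000)
utilities = np.array([sample_utility(d) for d in designs])

# Step 2: fit a smooth curve through the noisy (design, utility) points.
# Any scatterplot smoother would do; a low-degree polynomial keeps it simple.
u_hat = np.poly1d(np.polyfit(designs, utilities, deg=4))

# Step 3: maximize the fitted surface deterministically.
grid = np.linspace(0.0, 5.0, 1001)
d_star = grid[np.argmax(u_hat(grid))]
print(d_star)
```

The key efficiency gain is that each design point needs only a single (or a few) noisy utility evaluations, because the smoother borrows strength from neighboring designs instead of averaging many Monte Carlo draws at each candidate design separately.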

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1322-1330 |
| Number of pages | 9 |
| Journal | Journal of the American Statistical Association |
| Volume | 90 |
| Issue number | 432 |
| DOIs | https://doi.org/10.1080/01621459.1995.10476636 |
| State | Published - 1995 |
| Externally published | Yes |

### Keywords

- Bayesian design
- Optimal statistical decisions
- Simulation
- Stochastic optimization

### ASJC Scopus subject areas

- Statistics and Probability
- Statistics, Probability and Uncertainty

### Cite this

Müller, P., & Parmigiani, G. (1995). Optimal design via curve fitting of Monte Carlo experiments. *Journal of the American Statistical Association*, *90*(432), 1322-1330. https://doi.org/10.1080/01621459.1995.10476636

Research output: Contribution to journal › Article

TY - JOUR

T1 - Optimal design via curve fitting of Monte Carlo experiments

AU - Müller, Peter

AU - Parmigiani, Giovanni

PY - 1995

Y1 - 1995

AB - This article explores numerical methods for stochastic optimization, with special attention to Bayesian design problems. A common and challenging situation occurs when the objective function (in Bayesian applications, the expected utility) is very expensive to evaluate, perhaps because it requires integration over a space of very high dimensionality. Our goal is to explore a class of optimization algorithms designed to gain efficiency in such situations by exploiting smoothness of the expected utility surface and borrowing information from neighboring design points. The central idea is to implement stochastic optimization by curve fitting of Monte Carlo samples. This is done by simulating draws from the joint parameter/sample space and evaluating the observed utilities. Fitting a smooth surface through these simulated points serves as an estimate of the expected utility surface. The optimal design can then be found deterministically. In this article we introduce a general algorithm for curve-fitting-based optimization, discuss implementation options, and present a consistency property for one particular implementation of the algorithm. To illustrate the advantages and limitations of curve-fitting-based optimization, and to compare it with some of the alternatives, we consider in detail two important practical applications: an information-theoretic stopping rule for a clinical trial, with an objective function based on the expected amount of information acquired about a subvector of parameters of interest, and the design of exploratory shock levels in the implantation of heart defibrillators. The latter example is also used for comparison with some of the alternative schemes. One of the main attractions of efficient optimization algorithms in design is the application to sequential problems. We conclude with an outlook on how the ideas presented here can be extended to solve stochastic dynamic programming problems such as those occurring in Bayesian sequential design.

KW - Bayesian design

KW - Optimal statistical decisions

KW - Simulation

KW - Stochastic optimization

UR - http://www.scopus.com/inward/record.url?scp=21344450598&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=21344450598&partnerID=8YFLogxK

U2 - 10.1080/01621459.1995.10476636

DO - 10.1080/01621459.1995.10476636

M3 - Article

AN - SCOPUS:21344450598

VL - 90

SP - 1322

EP - 1330

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 432

ER -