We consider how to efficiently allocate computing resources in order to infer the best of a finite set of simulated systems, where best means that the system has the largest expected performance measure. Commonly used frequentist procedures based on the indifference zone and the `worst possible configuration' tend to prescribe an inefficiently large number of replications in practice. Recent work suggests that simulating likely competitors for the `best' may lead to an order-of-magnitude reduction in computing effort. Much of that work, however, makes strong assumptions that might not hold in practice, such as known variances or an identical cost per replication for every system. This paper addresses the problem of allocating computing resources to identify the best simulated system under more general conditions, including a different cost per replication for each system, both opportunity cost (linear loss) and 0-1 loss, and known or unknown variances.
Stephen E. Chick, Koichiro Inoue
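
To make the allocation idea concrete, here is a minimal illustrative sketch, not the authors' procedures: a greedy sequential rule that, after an initial stage of n0 replications per system, spends each additional replication on the non-incumbent system judged most likely to overtake the current sample-best, using a normal approximation with plug-in (unknown) variances. The function name `allocate_replications`, the interface of `simulate`, and all parameter values are assumptions for illustration; a fuller procedure would also refine the incumbent and account for per-replication costs.

```python
# Illustrative sketch only (not the paper's LL or 0-1 procedures).
import numpy as np
from scipy.stats import norm


def allocate_replications(simulate, k, n0=10, budget=200, rng=None):
    """Spend `budget` replications across `k` systems; return sample means and counts.

    simulate(i, rng) -> one performance observation for system i (larger is better).
    """
    rng = np.random.default_rng(rng)
    # Stage 1: n0 replications for every system.
    samples = [[simulate(i, rng) for _ in range(n0)] for i in range(k)]
    spent = k * n0
    # Stage 2: allocate one replication at a time.
    while spent < budget:
        means = np.array([np.mean(s) for s in samples])
        variances = np.array([np.var(s, ddof=1) for s in samples])
        counts = np.array([len(s) for s in samples])
        best = int(np.argmax(means))
        # Approximate P(system i overtakes the current sample-best), i != best,
        # using a normal approximation with estimated variances.
        se = np.sqrt(variances / counts + variances[best] / counts[best])
        p_overtake = norm.sf((means[best] - means) / se)
        p_overtake[best] = -1.0  # never pick the incumbent in this simple rule
        target = int(np.argmax(p_overtake))
        samples[target].append(simulate(target, rng))
        spent += 1
    return [np.mean(s) for s in samples], [len(s) for s in samples]
```

For example, with `simulate = lambda i, rng: rng.normal(loc=i, scale=2.0)` and `k = 5`, the rule concentrates later replications on the systems whose sample means are close to the leader, which is the intuition behind simulating likely competitors for the best rather than spreading effort uniformly.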