Xie J.,Credibly | Frazier P.I.,Cornell University | Chick S.E.,Technology and Operations Management Area
Operations Research | Year: 2016

This paper addresses discrete optimization via simulation. We show that allowing for both a correlated prior distribution on the means (e.g., with discrete Kriging models) and sampling correlation (e.g., with common random numbers, or CRN) can significantly improve the ability to quickly identify the best alternative. These two correlations are brought together for the first time in a highly sequential knowledge-gradient sampling algorithm, which chooses points to sample using a Bayesian value of information (VOI) criterion. We provide almost-sure convergence guarantees as the number of samples grows without bound when parameters are known, and provide approximations that allow practical implementation, including a novel use of the VOI's gradient rather than the response surface's gradient. We demonstrate that CRN leads to improved optimization performance for VOI-based algorithms in sequential sampling environments with a combinatorial number of alternatives and costly samples. © 2016 INFORMS.
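The variance-reduction effect of CRN that underlies this result can be illustrated in a few lines. The sketch below (toy parameters and function names are ours, not the paper's) compares the variance of an estimated difference between two alternatives when both are driven by a shared random stream versus independent streams:

```python
import random

def estimate_diff_variance(n, use_crn, seed=42):
    """Estimate mean and variance of the performance difference between two
    alternatives. With CRN, both alternatives see the same 'demand' shock
    each replication, so the shock cancels in the difference."""
    common = random.Random(seed)      # shared demand stream
    other = random.Random(seed + 1)   # independent stream for B when CRN is off
    idio = random.Random(seed + 2)    # idiosyncratic noise, always independent
    diffs = []
    for _ in range(n):
        d_a = common.gauss(0.0, 1.0)
        d_b = d_a if use_crn else other.gauss(0.0, 1.0)
        x = 10.0 + 2.0 * d_a + idio.gauss(0.0, 0.5)  # alternative A: mean 10
        y = 9.5 + 2.0 * d_b + idio.gauss(0.0, 0.5)   # alternative B: mean 9.5
        diffs.append(x - y)
    m = sum(diffs) / n
    return m, sum((d - m) ** 2 for d in diffs) / (n - 1)
```

Because the common shock cancels, far fewer samples are needed to tell the alternatives apart, which is what makes CRN valuable inside a VOI criterion.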

Aral K.D.,Technology and Operations Management Area | Chick S.E.,Technology and Operations Management Area | Grabosch A.,National Health Insurance Company Daman
Proceedings - Winter Simulation Conference | Year: 2015

Type 2 Diabetes Mellitus (T2DM) and its complications account for 11% of the global health expenditure (IDF 2012). Different primary, secondary, and tertiary preventive interventions promise better health outcomes and cost savings but are often studied separately. This paper proposes a simulation model for T2DM that captures the nonlinear interactions of multiple interventions for various stages of T2DM on population dynamics, health outcomes, and costs. We summarize the model, then demonstrate how we addressed the important challenge of fitting input parameters given that data needed to be combined from disparate sources in a way that calibrates input parameters to output metrics over a range of decision variables (a form of model calibration to achieve a response-model match to clinical data). We present preliminary numerical results to inform policies for T2DM prevention and management. © 2014 IEEE.
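In its simplest form, the calibration step described above amounts to searching for the input parameter whose simulated output metric best matches the observed clinical value. A minimal stand-in (our own formulation, not the paper's procedure) looks like:

```python
def calibrate(simulate, target, grid):
    """Brute-force calibration sketch: return the candidate parameter whose
    simulated output metric is closest to the observed clinical target.
    `simulate` maps a parameter to an output metric; `grid` is the set of
    candidate parameter values to search over."""
    return min(grid, key=lambda p: abs(simulate(p) - target))
```

The actual model calibrates multiple parameters against multiple metrics over a range of decision variables, but the discrepancy-minimization idea is the same.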

Yoo T.,Pohang University of Science and Technology | Cho H.,Pohang University of Science and Technology | Yucesan E.,Technology and Operations Management Area
Expert Systems with Applications | Year: 2010

Supply chain optimization, as a key determinant of strategic resource mobility along the value-added chain, allows each participant in the global network to capitalize on its particular strategic competency. Simulation is widely used to test the impact of strategic-level decisions, such as the number of plants, the modes of transport, or the relocation of warehouses, on supply chain performance. However, the complexity of the supply chain optimization problem and the stochastic nature of simulation impose an unaffordable computational load: evaluating a large number of alternatives for supply chain optimization is NP-hard, and a large number of simulation replications is required to evaluate the performance of each alternative accurately. The objective of the present work is to propose a hybrid algorithm that applies the nested partitions (NP) method and the optimal computing budget allocation (OCBA) method to reduce the computational load and thereby improve the efficiency of supply chain optimization via discrete-event simulation. The NP method is a global sampling strategy that is continuously adapted via a partitioning of the feasible solution region; it reduces the number of candidate alternatives to be evaluated. The OCBA method minimizes the number of samples (simulation replications) required to evaluate a particular alternative by allocating computing resources to potentially critical alternatives. Carefully designed experiments provide extensive numerical results illustrating the benefits of the proposed approach. © 2009 Elsevier Ltd. All rights reserved.
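The OCBA allocation rule can be sketched in a one-shot form (assuming independent normal simulation outputs and larger-is-better; function and variable names are ours):

```python
import math

def ocba_allocation(means, stds, budget):
    """One-shot OCBA-style split of a simulation budget (sketch).
    Non-best designs i get shares proportional to (sigma_i / delta_i)^2,
    where delta_i is the mean gap to the current sample best; the best
    design b gets sigma_b * sqrt(sum_{i != b} (share_i / sigma_i)^2)."""
    k = len(means)
    b = max(range(k), key=lambda i: means[i])   # current sample best
    share = [0.0] * k
    for i in range(k):
        if i != b:
            delta = means[b] - means[i]
            share[i] = (stds[i] / delta) ** 2
    share[b] = stds[b] * math.sqrt(
        sum((share[i] / stds[i]) ** 2 for i in range(k) if i != b))
    total = sum(share)
    return [budget * s / total for s in share]
```

Note how a close competitor (small gap to the best) receives far more replications than a clearly inferior design; in practice the rule is applied iteratively as estimates are refined.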

Frazier P.I.,Cornell University | Xie J.,Cornell University | Chick S.E.,Technology and Operations Management Area
Proceedings - Winter Simulation Conference | Year: 2011

We consider optimization via simulation over a finite set of alternatives. We employ a Bayesian value-of-information approach in which we allow both correlated prior beliefs on the sampling means and correlated sampling. Correlation in the prior belief allows us to learn about an alternative's value from samples of similar alternatives. Correlation in sampling, achieved through common random numbers, allows us to reduce the variance in comparing one alternative to another. We allow for a more general combination of both types of correlation than has been offered previously in the Bayesian ranking and selection literature. We do so by giving an exact expression for the value of information for sampling the difference between a pair of alternatives, and by deriving new knowledge-gradient methods based on this valuation. © 2011 IEEE.
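The core quantity in knowledge-gradient methods is the expected improvement from one more sample, which for a normal belief reduces to the function f(z) = z·Φ(z) + φ(z). A simplified pairwise version (our sketch; the paper's exact expression also accounts for the full correlated prior) is:

```python
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def kg_factor(z):
    """f(z) = z * Phi(z) + phi(z): expected positive part of a standard
    normal variable shifted by z."""
    return z * normal_cdf(z) + normal_pdf(z)

def pairwise_voi(mu_i, mu_j, s_tilde):
    """Value of sampling the difference between alternatives i and j under a
    normal belief; s_tilde is the predictive standard deviation of the change
    in the posterior mean difference induced by the sample."""
    return s_tilde * kg_factor(-abs(mu_i - mu_j) / s_tilde)
```

The shape is intuitive: near-ties are worth resolving, clear winners are not, and more informative samples (larger s_tilde) are worth more.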

Bendavid I.,ORT Braude College | Herer Y.T.,Technion - Israel Institute of Technology | Yucesan E.,Technology and Operations Management Area
Proceedings - Winter Simulation Conference | Year: 2015

The objective of inventory management models is to determine effective policies for managing the trade-off between customer satisfaction and the cost of service. These models have become increasingly sophisticated, incorporating many complicating factors that are relevant in practice such as demand uncertainty, finite supplier capacity, and yield losses. Curiously absent from these models are the financial constraints imposed by working capital requirements (WCR). In practice, many firms are self-financing; their ability to replenish their own inventories is directly affected not only by their current inventory levels, but also by their receivables and payables. In this paper, we analyze the materials management practices of a self-financing firm whose replenishment decisions are constrained by cash flows, which are updated periodically following purchases and sales in each period. In particular, we investigate the interaction between the financial and operational parameters as well as the impact of WCR constraints on the long-run average cost. © 2014 IEEE.
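The self-financing dynamic can be illustrated with a deliberately crude single-item simulation (entirely our own toy model: one cash account, no receivables or payables, which the paper's model does track):

```python
import random

def simulate_wcr(periods, base_stock, price, cost, cash0, seed=0):
    """Cash-constrained base-stock simulation sketch: each period the firm
    orders up to `base_stock`, but the purchase quantity is capped by the
    cash on hand (a crude working-capital-requirement constraint)."""
    rng = random.Random(seed)
    cash, inv = cash0, 0.0
    for _ in range(periods):
        want = max(0.0, base_stock - inv)
        buy = min(want, cash / cost)      # replenishment limited by cash
        cash -= buy * cost
        inv += buy
        demand = rng.uniform(0.0, 10.0)
        sold = min(inv, demand)
        inv -= sold
        cash += sold * price
    return cash, inv
```

Even this toy version shows the key feedback loop: a cash-poor firm cannot stock up, which depresses sales and keeps it cash-poor.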

Sosa M.,Technology and Operations Management Area | Mihm J.,Technology and Operations Management Area | Browning T.,Texas Christian University
Journal of Mechanical Design, Transactions of the ASME | Year: 2011

Complex engineered systems tend to have architectures in which a small subset of components exhibits a disproportionate number of linkages. Such components are known as hubs. This paper examines the degree distribution of systems to identify the presence of hubs and quantify the fraction of hub components. We examine how the presence and fraction of hubs relate to a system's quality. We provide empirical evidence that the presence of hubs in a system's architecture is associated with a low number of defects. Furthermore, we show that complex engineered systems may have an optimal fraction of hub components with respect to system quality. Our results suggest that architects and managers aiming to improve the quality of complex system designs must proactively identify and manage the use of hubs. Our paper provides a data-driven approach for identifying appropriate target levels of hub usage. © 2011 American Society of Mechanical Engineers.
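Identifying hubs from a component-linkage graph is straightforward once a cutoff is chosen. The sketch below uses one simple operationalization (degree above mean plus two standard deviations; the paper works with the full degree distribution rather than a fixed cutoff):

```python
def find_hubs(edges, n, threshold_factor=2.0):
    """Return the hub components of an undirected component-linkage graph
    and their fraction. A node is flagged as a hub when its degree exceeds
    mean + threshold_factor * std of the degree distribution."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    mean = sum(deg) / n
    std = (sum((d - mean) ** 2 for d in deg) / n) ** 0.5
    hubs = [i for i, d in enumerate(deg) if d > mean + threshold_factor * std]
    return hubs, len(hubs) / n
```

On a star-shaped architecture, for example, only the central component is flagged, giving a hub fraction of 1/n.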

Yucesan E.,Technology and Operations Management Area
IIE Transactions (Institute of Industrial Engineers) | Year: 2015

Breakthrough innovation has two key prerequisites: idea generation, the collection of a large number of competing designs, and idea screening, the efficient evaluation and ranking of these designs to identify the best one(s). Open innovation has recently been modeled and analyzed as innovation contests, where many individuals or teams submit designs or prototypes to an innovating firm. Innovation tournaments increase the capacity for idea generation by enabling access to a broad pool of solvers while avoiding exorbitant costs. To deliver on their promise, however, such tournaments must be designed to enable effective screening of proposed ideas. In particular, given the large number of designs to be evaluated, tournaments must be efficient, favoring quick judgments based on imperfect information over extensive data collection. Through a simulation study, this article shows that contests may not necessarily be the best process for ranking innovation opportunities and selecting the best ones in an efficient way. Instead, we propose a ranking and selection approach based on ordinal optimization, which provides both efficiency and accuracy by dynamically allocating evaluation effort away from inferior designs and onto promising ones. A numerical example quantifies the benefits. The proposed approach should therefore complement innovation tournaments' capacity for idea generation with efficient idea screening. © 2013 IIE.
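The idea of steering evaluation effort away from inferior designs can be sketched with a successive-halving scheme (a simple stand-in for the paper's ordinal-optimization allocation; names and parameters are ours):

```python
import random

def screen_designs(true_scores, budget_per_round=4, seed=1):
    """Successive-halving screening sketch: each round, every surviving
    design receives a few noisy evaluations, then the worse half is
    eliminated. Returns the index of the last surviving design."""
    rng = random.Random(seed)
    alive = list(range(len(true_scores)))
    totals = {i: 0.0 for i in alive}
    counts = {i: 0 for i in alive}
    while len(alive) > 1:
        for i in alive:
            for _ in range(budget_per_round):
                totals[i] += true_scores[i] + rng.gauss(0.0, 0.5)
                counts[i] += 1
        alive.sort(key=lambda i: totals[i] / counts[i], reverse=True)
        alive = alive[:max(1, len(alive) // 2)]
    return alive[0]
```

Designs eliminated early consume only a handful of evaluations, so most of the budget is concentrated on the contenders, which is the efficiency argument made above.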

Van Den Broeke M.,Technology and Operations Management Area | Van Den Broeke M.,Catholic University of Leuven | Boute R.,Technology and Operations Management Area | Boute R.,Catholic University of Leuven | Samii B.,Technology and Operations Management Area
International Journal of Production Research | Year: 2015

Over the past decades, several companies have introduced product platforms in the design of their products in order to produce a large product variety in a cost-efficient way. However, for some companies, the introduction of platforms ended up being more costly than expected, leading them to reconsider their platform decisions. In this paper, we develop a model to support companies in determining (1) how many platforms to develop, (2) which platforms to develop and (3) which products to derive from which platforms. The model takes into account the impact of these product-platform decisions on a company's relevant supply chain activities and costs. The model shows how the optimal product-platform decisions depend on the trade-off between the costs of platforms versus the costs of customising these platforms to final product variants. We propose a simulated annealing algorithm to solve large problem instances within reasonable time. The practical validity of our model is shown through its application in a global technology company specialised in the development and production of medical screens. © 2015 Taylor & Francis.
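The simulated-annealing solver mentioned above follows the standard accept/reject template; a generic sketch is below (the `cost` and `neighbor` callables would encode the paper's platform-selection costs and moves, which are not reproduced here):

```python
import math
import random

def anneal(cost, neighbor, x0, t0=1.0, cooling=0.95, iters=500, seed=0):
    """Generic simulated-annealing loop (sketch). Downhill moves are always
    accepted; uphill moves are accepted with probability exp(-delta / t),
    where the temperature t decays geometrically."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest
```

For the product-platform problem, a state would be an assignment of products to platforms and a neighbor move would reassign one product or open/close one platform.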

Pasupathy R.,Virginia Polytechnic Institute and State University | Szechtman R.,Naval Postgraduate School, Monterey | Yucesan E.,Technology and Operations Management Area
Proceedings - Winter Simulation Conference | Year: 2010

Ranking and selection (R&S) techniques are statistical methods developed to select the best system, or a subset of systems, from among a set of alternative system designs. R&S via simulation is particularly appealing as it combines the modeling flexibility of simulation with the efficiency of statistical techniques for effective decision making. The overwhelming majority of R&S research, however, focuses on the expected performance of competing designs. Alternatively, quantiles, which provide additional information about the distribution of the performance measure of interest, may serve as better risk measures than the usual expected value. In stochastic systems, quantiles indicate the level of system performance that can be delivered with a specified probability. In this paper, we address the problem of ranking and selection based on quantiles. In particular, we formulate the problem and characterize the optimal budget allocation scheme using large deviations theory. © 2010 IEEE.
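Selecting on a quantile rather than a mean changes which design wins whenever the candidates differ in variability. The sketch below (our illustration of the problem setting; the paper's contribution is the optimal budget split, not shown here) ranks designs by the empirical p-quantile of their simulated cost:

```python
import math
import random

def quantile(samples, p):
    """Empirical p-quantile via the order-statistic estimator."""
    s = sorted(samples)
    idx = min(len(s) - 1, max(0, math.ceil(p * len(s)) - 1))
    return s[idx]

def select_best_by_quantile(systems, p, n, seed=0):
    """Pick the design with the smallest p-quantile of simulated cost.
    `systems` maps a name to (mu, sigma) of a normal cost distribution;
    each design receives n replications (equal allocation, for simplicity)."""
    rng = random.Random(seed)
    best, best_q = None, float("inf")
    for name, (mu, sigma) in systems.items():
        q = quantile([rng.gauss(mu, sigma) for _ in range(n)], p)
        if q < best_q:
            best, best_q = name, q
    return best
```

A low-mean but high-variance design can lose to a slightly costlier but far more reliable one, which is exactly the risk-measure argument made above.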

Gong Y.,EMLYON Business School | Yucesan E.,Technology and Operations Management Area
International Journal of Production Economics | Year: 2012

Transshipments, monitored movements of material at the same echelon of a supply chain, represent an effective pooling mechanism. Earlier papers dealing with transshipments either do not incorporate replenishment lead times into their analysis, or only provide a heuristic algorithm whose optimality cannot be guaranteed beyond settings with two locations. This paper combines infinitesimal perturbation analysis with a stochastic approximation method to examine the multi-location transshipment problem with positive replenishment lead times. It demonstrates the computation of optimal base stock quantities through sample path optimization. From a methodological perspective, this paper deploys a duality-based gradient computation method to improve computational efficiency. From an application perspective, it solves transshipment problems with non-negligible replenishment lead times. A numerical study illustrates the performance of the proposed approach. © 2010 Elsevier B.V. All rights reserved.
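The IPA-plus-stochastic-approximation idea is easiest to see in a single-location newsvendor (a toy stand-in for the multi-location transshipment model; parameters and names are ours). The sample-path derivative of the per-period cost h·(S−D)⁺ + b·(D−S)⁺ with respect to the base stock S is h·1{D<S} − b·1{D>S}, and a Robbins-Monro recursion drives it to zero:

```python
import random

def optimize_base_stock(h, b, demand, iters=5000, s0=0.0, seed=0):
    """Stochastic approximation with an IPA-style per-sample gradient.
    h is the unit holding cost, b the unit backorder cost, and `demand`
    a callable drawing one demand realization from an rng."""
    rng = random.Random(seed)
    s = s0
    for k in range(1, iters + 1):
        d = demand(rng)
        grad = h if d < s else -b      # IPA sample-path derivative
        s -= (10.0 / k) * grad         # diminishing step size
    return s
```

The iterate converges to the critical fractile F⁻¹(b/(h+b)); the paper extends this sample-path approach to multiple locations with transshipment and positive lead times.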
