Prekopa A.,Rutgers Center for Operations Research | Unuvar M.,IBM
Operations Research | Year: 2015

Single-commodity networks are considered, where demands at the nodes are random. The problem is to find minimum-cost built-in capacities at the nodes and arcs, subject to the constraint that all demands be met at a prescribed probability level (reliability constraint) and that certain deterministic constraints be satisfied. The reliability constraint is formulated in terms of the Gale-Hoffman feasibility inequalities, whose number is reduced by an elimination technique. The concept of a p-efficient point is used to convert the problem and then relax it into an LP. The p-efficient points are generated simultaneously with the solution of the LP: the joint distribution of the demands is used to obtain p-efficient points for all stochastic inequalities that were not eliminated, and each point is generated by solving a multiple-choice knapsack problem. The model can be applied to planning in interconnected power systems, flood control networks, design of shelter and road capacities in evacuation, parking lot capacities, financial networks, cloud computing system design, etc. Numerical examples are presented. © 2015 INFORMS. Source
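
A small illustration of the p-efficient point concept used above may help: for a discrete joint demand distribution with CDF F, a point z is p-efficient if F(z) >= p and no componentwise-smaller point also reaches level p. The brute-force sketch below is not the authors' algorithm (which generates these points via an LP and a multiple-choice knapsack); it simply enumerates the p-efficient points of a hypothetical two-node demand distribution.

    import itertools
    import numpy as np

    # Hypothetical joint distribution of two node demands, each on {0, 1, 2, 3}.
    support = list(itertools.product(range(4), repeat=2))
    rng = np.random.default_rng(0)
    pmf = dict(zip(support, rng.dirichlet(np.ones(len(support)))))

    def cdf(z):
        """F(z) = P(demand <= z componentwise)."""
        return sum(p for pt, p in pmf.items() if all(a <= b for a, b in zip(pt, z)))

    p = 0.9
    # Points whose CDF reaches level p ...
    candidates = [z for z in support if cdf(z) >= p]
    # ... and are not dominated by another such point are the p-efficient points.
    p_efficient = [z for z in candidates
                   if not any(w != z and all(a <= b for a, b in zip(w, z))
                              for w in candidates)]
    print("p-efficient points:", p_efficient)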


Subasi M.M.,Florida Institute of Technology | Subasi E.,Rutgers Center for Operations Research | Anthony M.,The London School of Economics and Political Science
International Symposium on Artificial Intelligence and Mathematics, ISAIM 2012 | Year: 2012

In data analysis problems where the data are represented by vectors of real numbers, it is often the case that some data points have "missing values", meaning that one or more entries of the vector describing the data point are not observed. In this paper, we propose a new approach to the imputation of missing binary values that employs a "similarity measure". We compare experimentally the performance of our technique with that of approaches based on the usual Hamming distance measure and on multiple imputation. Source
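
For orientation, the sketch below illustrates the Hamming-distance baseline this paper compares against, filling each missing binary entry from the nearest complete record; it is not the authors' similarity-measure method, and the toy records are hypothetical.

    # Missing entries are encoded as None; each is filled with the value of the
    # nearest complete record, where distance counts disagreements on the
    # coordinates observed in both records.

    def hamming_on_observed(u, v):
        pairs = [(a, b) for a, b in zip(u, v) if a is not None and b is not None]
        return sum(a != b for a, b in pairs) / max(len(pairs), 1)

    def impute(record, complete_records):
        nearest = min(complete_records, key=lambda r: hamming_on_observed(record, r))
        return [a if a is not None else b for a, b in zip(record, nearest)]

    complete = [[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 0, 1]]
    with_missing = [1, None, 1, None]
    print(impute(with_missing, complete))   # -> [1, 0, 1, 1]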


Prekopa A.,Rutgers Center for Operations Research | Yoda K.,Rutgers Center for Operations Research | Subasi M.M.,Florida Institute of Technology
Operations Research Letters | Year: 2011

A probabilistic constrained stochastic linear programming problem is considered, where the rows of the random technology matrix are independent and normally distributed. The quasi-concavity of the constraining function needed for the convexity of the problem is ensured if the factors of the function are uniformly quasi-concave. A necessary and sufficient condition is given for that property to hold. It is also shown, through numerical examples, that such a special problem still has practical application in optimal portfolio construction. © 2011 Elsevier B.V. All rights reserved. Source
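
As a concrete illustration of the setting, assume the probabilistic constraint has the form P(Tx >= b) >= p, where the rows t_i of the technology matrix T are independent and normally distributed; independence lets the joint probability factor into one-dimensional normal CDFs, P(t_i'x >= b_i) = Phi((mu_i'x - b_i) / sqrt(x'Sigma_i x)). The sketch below evaluates that product for hypothetical means, covariances, and right-hand sides; it is not the paper's quasi-concavity analysis.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical data: two random rows, each bivariate normal.
    mu = [np.array([0.08, 0.05]), np.array([0.04, 0.07])]     # row means
    Sigma = [0.01 * np.eye(2), 0.02 * np.eye(2)]              # row covariances
    b = [0.05, 0.04]                                          # right-hand sides

    def joint_prob(x):
        """Product of the row-wise probabilities P(t_i' x >= b_i)."""
        factors = [norm.cdf((m @ x - bi) / np.sqrt(x @ S @ x))
                   for m, S, bi in zip(mu, Sigma, b)]
        return float(np.prod(factors))

    x = np.array([0.6, 0.4])     # a candidate portfolio (weights)
    print(joint_prob(x))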


Ninh A.,Rutgers Center for Operations Research | Prekopa A.,Rutgers Center for Operations Research
Operations Research Letters | Year: 2013

Discrete moment problems (DMP) with integer moments were first introduced by Prékopa to provide sharp lower and upper bounds for functions of discrete random variables. Prékopa also developed fast and stable dual type linear programming methods for the numerical solutions of the problem. In this paper, we assume that some fractional moments are also available and propose basic theory and a solution method for the bounding problems. Numerical experiments show significant improvement in the tightness of the bounds. © 2013 Published by Elsevier B.V. Source
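
The bounding LP behind a discrete moment problem can be written down directly: with known support z_0 < ... < z_n and known moments, optimize E[f(X)] over all distributions on that support matching the moments. The sketch below solves the integer-moment version for a hypothetical support and moment set to bound a tail probability; fractional moments, the paper's extension, would simply add rows of the form z_i^alpha to the constraint matrix.

    import numpy as np
    from scipy.optimize import linprog

    z = np.arange(6, dtype=float)            # support {0, ..., 5}
    moments = {0: 1.0, 1: 2.5, 2: 7.9}       # mu_0 = 1 is the total probability
    f = (z >= 4).astype(float)               # bound P(X >= 4)

    # Constraint rows: sum_i z_i^k p_i = mu_k for each known moment order k.
    A_eq = np.vstack([z ** k for k in moments])
    b_eq = np.array([moments[k] for k in moments])

    lower = linprog(c=f, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    upper = linprog(c=-f, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    print("bounds on P(X >= 4):", lower.fun, -upper.fun)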


Fix A.,Cornell University | Gruber A.,Rutgers Center for Operations Research | Boros E.,Rutgers Center for Operations Research | Zabih R.,Cornell University
IEEE Transactions on Pattern Analysis and Machine Intelligence | Year: 2015

Higher-order Markov Random Fields, which can capture important properties of natural images, have become increasingly important in computer vision. While graph cuts work well for first-order MRFs, until recently they have rarely been effective for higher-order MRFs. Ishikawa's graph cut technique [1], [2] shows great promise for many higher-order MRFs. His method transforms an arbitrary higher-order MRF with binary labels into a first-order one with the same minima. If all the terms are submodular, the exact solution can easily be found; otherwise, pseudo-Boolean optimization techniques can produce an optimal labeling for a subset of the variables. We present a new transformation with better performance than [1], [2], both theoretically and experimentally. While [1], [2] transforms each higher-order term independently, we use the underlying hypergraph structure of the MRF to transform a group of terms at once. For n binary variables, each of which appears in terms with k other variables, at worst we produce n non-submodular terms, while [1], [2] produces O(nk). We identify a local completeness property under which our method performs even better, and show that under certain assumptions several important vision problems (including common variants of fusion moves) have this property. We show experimentally that our method produces a smaller total weight of non-submodular edges, and that this metric is directly related to the effectiveness of QPBO [3]. Running on the same field-of-experts dataset used in [1], [2], we optimally label significantly more variables (96 versus 80 percent) and converge more rapidly to a lower energy. Preliminary experiments suggest that some other higher-order MRFs used in stereo [4] and segmentation [5] are also locally complete and would thus benefit from our work. © 2014 IEEE. Source
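
To make the idea of reducing a higher-order term to a first-order (pairwise) energy concrete, the sketch below verifies the standard single-term reduction with one auxiliary binary variable; this is not the paper's grouped hypergraph transformation, only the classical building block it improves upon. The resulting pairwise terms a*w*x_i are submodular whenever a < 0, which is what keeps the reduced energy graph-cut friendly.

    import itertools

    # Reduction of a negative-coefficient cubic term a*x1*x2*x3 (a < 0) to a
    # pairwise energy via one auxiliary binary variable w:
    #     a*x1*x2*x3 == min over w in {0,1} of a*w*(x1 + x2 + x3 - 2)
    a = -3.0
    for x1, x2, x3 in itertools.product((0, 1), repeat=3):
        higher_order = a * x1 * x2 * x3
        reduced = min(a * w * (x1 + x2 + x3 - 2) for w in (0, 1))
        assert higher_order == reduced
    print("reduction verified on all 8 assignments")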
