
Journal of Convex Analysis 25 (2018), No. 1, 201–218
Copyright Heldermann Verlag 2018

Chance-Constrained Convex Mixed-Integer Optimization and Beyond: Two Sampling Algorithms within S-Optimization

Jesus A. De Loera, Department of Mathematics, University of California, One Shields Avenue, Davis, CA 95616, U.S.A., deloera@math.ucdavis.edu
Reuben N. La Haye, Department of Mathematics, University of California, One Shields Avenue, Davis, CA 95616, U.S.A., rlahaye@math.ucdavis.edu
Déborah Oliveros, Instituto de Matemáticas, Universidad Nacional Autónoma de México (UNAM), Campus Juriquilla, 04510 México, dolivero@matem.unam.mx
Edgardo Roldán-Pensado, Centro de Ciencias Matemáticas, Universidad Nacional Autónoma de México (UNAM), Campus Morelia, 04510 México, e.roldan@im.unam.mx

Abstract: This paper makes two contributions to optimization theory derived from new methods of discrete convex analysis.

Our first contribution is to stochastic optimization. The scenario approach developed by Calafiore and Campi to attack chance-constrained convex programs (i.e., optimization problems with convex constraints parametrized by an uncertainty parameter) uses random sampling of the uncertainty parameter to replace the original problem with a deterministic continuous convex optimization problem with $N$ convex constraints, which is a relaxation of the original. Calafiore and Campi provided an explicit estimate of the sample size $N$ needed for the relaxation to yield, with high likelihood, feasible solutions of the chance-constrained problem: they bounded the probability that the original constraints are violated by the random optimal solution of the relaxation of size $N$. We present a generalization of the Calafiore–Campi results to both integer and mixed-integer variables. We demonstrate that their sampling estimates work naturally even for variables taking more sophisticated values restricted to some subset $S$ of $\mathbb{R}^d$.
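The scenario relaxation described above can be sketched on a toy chance-constrained linear program (this instance, the sampled uncertainty distribution, and the particular sample-size formula used are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance (hypothetical): minimize x1 + x2 subject to
# d1*x1 + d2*x2 >= 1 for an uncertain coefficient vector d = (d1, d2).
# The scenario approach draws N realizations of d and imposes the
# constraint for every sampled scenario.
rng = np.random.default_rng(0)

eps, beta, dim = 0.05, 1e-3, 2  # violation level, confidence, dimension
# One commonly used scenario sample size of this general flavor
# (an assumption for illustration, not the paper's estimate):
N = int(np.ceil((2 / eps) * (np.log(1 / beta) + dim)))

# Sample uncertain coefficients, each entry uniform on [0.5, 1.5].
D = rng.uniform(0.5, 1.5, size=(N, dim))

# linprog solves min c@x s.t. A_ub@x <= b_ub, so rewrite
# D@x >= 1 as -D@x <= -1.
c = np.ones(dim)
res = linprog(c, A_ub=-D, b_ub=-np.ones(N), bounds=[(0, None)] * dim)
print(res.x)  # optimizer of the sampled (deterministic) relaxation
```

The optimizer of the sampled problem satisfies all $N$ drawn constraints, and the Calafiore–Campi theory bounds the probability that it violates a fresh, unseen realization of the uncertainty.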
In this way, a sampling (scenario) algorithm for chance-constrained convex mixed-integer optimization is just a very special case of a stronger sampling result in convex analysis.

Second, motivated by the first half of the paper, for a subset $S \subset \mathbb{R}^d$ we formally introduce the notion of an $S$-optimization problem, in which the variables take values in $S$. $S$-optimization generalizes continuous ($S=\mathbb{R}^d$), integer ($S=\mathbb{Z}^d$), and mixed-integer optimization ($S=\mathbb{R}^k \times \mathbb{Z}^{d-k}$). We illustrate with examples the expressive power of $S$-optimization to capture combinatorial and integer optimization problems with difficult modular constraints. We reinforce the evidence that $S$-optimization is "the right concept" by showing that a second well-known randomized sampling algorithm, due to K. Clarkson, for low-dimensional convex optimization problems can be extended to work with variables taking values in $S$. The key elements in all the proofs are generalizations of Helly's theorem in which the convex sets are required to intersect $S \subset \mathbb{R}^d$. The sample sizes in both algorithms are directly determined by the $S$-Helly numbers.

Keywords: chance-constrained optimization, convex mixed-integer optimization, optimization with restricted variable values, randomized sampling algorithms, Helly-type theorems, $S$-optimization, convexity spaces, combinatorial convexity.

MSC: 90C15, 90C11, 90C48