Journal of Convex Analysis 25 (2018), No. 1, 201--218
Copyright Heldermann Verlag 2018



Chance-Constrained Convex Mixed-Integer Optimization and Beyond: Two Sampling Algorithms within S-optimization

Jesus A. De Loera
Department of Mathematics, University of California, One Shields Avenue, Davis, CA 95616, U.S.A.
deloera@math.ucdavis.edu

Reuben N. La Haye
Department of Mathematics, University of California, One Shields Avenue, Davis, CA 95616, U.S.A.
rlahaye@math.ucdavis.edu

Déborah Oliveros
Instituto de Matemáticas, Universidad Nacional Autónoma de México (UNAM), Campus Juriquilla, 04510 México
dolivero@matem.unam.mx

Edgardo Roldán-Pensado
Centro de Ciencias Matemáticas, Universidad Nacional Autónoma de México (UNAM), Campus Morelia, 04510 México
e.roldan@im.unam.mx




This paper makes two contributions to optimization theory derived from new methods of discrete convex analysis.

Our first contribution is to stochastic optimization. The scenario approach developed by Calafiore and Campi to attack chance-constrained convex programs (i.e., optimization problems with convex constraints that are parametrized by an uncertainty parameter) uses random sampling of the uncertainty parameter to replace the original problem with a deterministic continuous convex optimization problem with $N$ convex constraints, which is a relaxation of the original. Calafiore and Campi provided an explicit estimate of the sample size $N$ needed for the relaxation to yield high-likelihood feasible solutions of the chance-constrained problem: they bounded the probability that the original constraints are violated by the random optimal solution of the relaxation of size $N$. We present a generalization of the Calafiore-Campi results to both integer and mixed-integer variables. We demonstrate that their sampling estimates work naturally even for variables restricted to take values in some subset $S$ of $\mathbb{R}^d$. In this way, a sampling or scenario algorithm for chance-constrained convex mixed-integer optimization is just a very special case of a stronger sampling result in convex analysis.

Second, motivated by the first half of the paper, for a subset $S \subset \mathbb{R}^d$ we formally introduce the notion of an $S$-optimization problem, in which the variables take values in $S$. $S$-optimization generalizes continuous ($S=\mathbb{R}^d$), integer ($S=\mathbb{Z}^d$), and mixed-integer optimization ($S=\mathbb{R}^k \times \mathbb{Z}^{d-k}$). We illustrate with examples the expressive power of $S$-optimization to capture combinatorial and integer optimization problems with difficult modular constraints. We reinforce the evidence that $S$-optimization is "the right concept" by showing that a second well-known randomized sampling algorithm, due to K. Clarkson, for low-dimensional convex optimization can be extended to work with variables taking values in $S$. The key element in all the proofs is a generalization of Helly's theorem in which the convex sets are required to intersect $S \subset \mathbb{R}^d$. The sample sizes in both algorithms are directly determined by the $S$-Helly numbers.
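As a point of reference (an illustrative sketch only; the notation below is not taken verbatim from the paper), the chance-constrained convex program over a variable set $S \subseteq \mathbb{R}^d$ and its scenario relaxation of size $N$ can be written as
$$\min_{x \in S} c^{\mathsf T}x \quad \text{s.t.} \quad \mathrm{Prob}\{\delta : f(x,\delta) \le 0\} \ge 1-\varepsilon \qquad \text{and} \qquad \min_{x \in S} c^{\mathsf T}x \quad \text{s.t.} \quad f(x,\delta^{(i)}) \le 0, \; i=1,\dots,N,$$
where $\delta^{(1)},\dots,\delta^{(N)}$ are independent samples of the uncertainty parameter and $f(\cdot,\delta)$ is convex in $x$ for each fixed $\delta$. The classical setting is $S=\mathbb{R}^d$; the generalization treated here also covers $S=\mathbb{Z}^d$ and $S=\mathbb{R}^k\times\mathbb{Z}^{d-k}$, with the required sample size $N$ governed by the $S$-Helly number in place of the dimension-based constant of the continuous case.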

Keywords: Chance-constrained optimization, convex mixed-integer optimization, optimization with restricted variable values, randomized sampling algorithms, Helly-type theorems, S-optimization, convexity spaces, combinatorial convexity.

MSC: 90C15, 90C11, 90C48
