
The Bruss–Robertson–Steele inequality

Published online by Cambridge University Press:  13 March 2023

L. C. G. Rogers*
Affiliation:
University of Cambridge
*
*Postal address: Statistical Laboratory, University of Cambridge, Centre for Mathematical Sciences, Wilberforce Road, Cambridge CB3 0WB. Email: chris@statslab.cam.ac.uk

Abstract

The Bruss–Robertson–Steele (BRS) inequality bounds the expected number of items of random size which can be packed into a given suitcase. Remarkably, no independence assumptions are needed on the random sizes, which points to a simple explanation: the inequality is the integrated form of an $\omega$-by-$\omega$ inequality, as this note proves.

Type
Original Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Applied Probability Trust

1. The basic problem

The Bruss–Robertson–Steele (BRS) inequality was first proved in [2], and later generalized in [3]. The recent survey [1] gives a fine review.

You have $N$ objects which you would like to take in your suitcase on a flight. The weight of object $j$ is $Z_j$, but the total weight you are allowed to take on the flight must not exceed $s>0$. You want to maximise the number of objects that you can take, subject to this constraint. This can be posed as a linear programme:

\begin{equation*} \max_{x \geq 0} \; \sum_i x_i \quad {\rm subject \, to} \quad x_i \leq 1 \ \text{for all } i, \ \sum_i x_i\,Z_i \leq s.\end{equation*}

Strictly, each $x_i$ should lie in $\{0,1\}$, but this additional constraint can only reduce the value; since we are looking for upper bounds, the relaxation works in our favour. We can write this in canonical matrix form,

\begin{equation*} \max_{x \geq 0} c^\top x \quad {\rm subject \, to}\quad Ax \leq b,\end{equation*}

where $c^\top = (1, \ldots,1)$ , $b^\top = (1, \ldots, 1,s)$ , and

\begin{equation*} A = \begin{pmatrix} I \\[4pt] Z^\top \end{pmatrix}.\end{equation*}
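As a numerical sanity check of this relaxation, one can solve the primal directly; the following is a minimal sketch assuming `scipy` is available, with illustrative weights (not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s = 10, 2.0
Z = rng.uniform(size=N)          # illustrative weights Z_1, ..., Z_N

# Relaxed primal: maximise sum(x) subject to 0 <= x_i <= 1 and Z.x <= s.
# linprog minimises, so negate the objective.
res = linprog(c=-np.ones(N),
              A_ub=Z.reshape(1, -1), b_ub=[s],
              bounds=[(0, 1)] * N)
value = -res.fun                 # value of the relaxed problem
```

Since we maximise the *number* of items under a weight cap, the integer optimum is obtained by packing the lightest items first, and the relaxed value dominates it.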

The dual linear programme is

\begin{equation*} \min_{y \geq 0} \; b^\top y \quad{\rm subject \, to}\quad c \leq A^\top y.\end{equation*}

Written out more fully, this is

(1) \begin{equation} \min_{y \geq 0} \; \sum_{j=1}^N y_j + s y_{N+1} \quad {\rm subject \, to}\quad 1 \leq y_j + Z_j y_{N+1} \ \text{for all } j. \end{equation}

The value of the dual problem equals the value of the primal problem (e.g. [4, Section 4.2]), and for any dual-feasible $y$ the value $b^\top y$ is an upper bound for the value of the problem. If we write $y_{N+1} = \eta$ for short, the problem in (1) becomes

\begin{equation*} \min_{y \geq 0} \; \sum_{j=1}^N y_j + \eta s \quad {\rm subject \, to}\quad 1 \leq y_j + \eta Z_j \ \text{for all } j.\end{equation*}
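The duality just invoked can be verified numerically: the sketch below (assuming `scipy`, with illustrative weights) solves this dual and the relaxed primal and checks that their values agree.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s = 10, 2.0
Z = rng.uniform(size=N)          # illustrative weights

# Dual variables: y_1, ..., y_N and eta = y_{N+1}.
c = np.append(np.ones(N), s)
# Constraint 1 <= y_j + Z_j * eta, rewritten as -y_j - Z_j * eta <= -1.
A_ub = np.hstack([-np.eye(N), -Z.reshape(-1, 1)])
b_ub = -np.ones(N)
dual = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (N + 1))

# Relaxed primal, for comparison (linprog minimises, so negate).
primal = linprog(-np.ones(N), A_ub=Z.reshape(1, -1), b_ub=[s],
                 bounds=[(0, 1)] * N)
```

Both problems are feasible and bounded, so strong duality holds and `dual.fun` should equal `-primal.fun` up to solver tolerance.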

Obviously, once $\eta>0$ has been chosen, the best dual-feasible choice of $y_1, \ldots, y_N$ is $y_j = (1-\eta Z_j)^+$. Thus, for any $\eta>0$, the value $\Phi^*$ of the problem is bounded above by

\begin{equation*} \Phi(\eta) \equiv \sum_{j=1}^N (1-\eta Z_j)^+ + \eta s,\end{equation*}

which is clearly a convex piecewise-linear function of $\eta$ .
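A small sketch (assuming `numpy`, with illustrative weights) makes the bound concrete: every $\Phi(\eta)$ dominates the exact packing value, and minimising over a grid of $\eta$ values tightens the bound, as convexity suggests.

```python
import numpy as np

rng = np.random.default_rng(1)
N, s = 10, 2.0
Z = rng.uniform(size=N)              # illustrative weights

def Phi(eta):
    """Dual bound Phi(eta) = sum_j (1 - eta*Z_j)^+ + eta*s."""
    return np.maximum(1.0 - eta * Z, 0.0).sum() + eta * s

def packed(Z, s):
    """Exact value Phi*: packing the lightest items first is optimal
    when maximising the number of items under a weight cap."""
    total, count = 0.0, 0
    for z in np.sort(Z):
        if total + z > s:
            break
        total, count = total + z, count + 1
    return count

# Minimise the piecewise-linear convex Phi over a grid of eta values.
best = min(Phi(eta) for eta in np.linspace(0.01, 10, 1000))
```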

2. The BRS inequality

In the problem studied by Bruss and Robertson, and later in greater generality by Steele, $Z_1, \ldots, Z_N$ are positive random variables, and the distribution function of $Z_j$ is $F_j$, assumed for convenience to be continuous. In this situation, the value $\Phi^*$ of the suitcase packing problem will of course be random, and the BRS inequality gives an upper bound for $\mathbb{E} \Phi^*$. Let us see how the BRS inequality follows easily from the linear-programming story of the previous section.

Clearly, for any $\eta>0$ we have

(2) \begin{align} \mathbb{E} \Phi^* \leq \mathbb{E} \Phi(\eta) & = \mathbb{E} \sum_{j=1}^N (1-\eta Z_j)^+ + \eta s \nonumber \\ & = \sum_{j=1}^N \int_0^{1/\eta} (1-\eta z)\; F_j(\mathrm{d} z) + \eta s. \end{align}

Now we optimize the bound in (2) over $\eta$; setting the derivative with respect to $\eta$ to zero gives

\begin{equation*} 0 = - \sum_{j=1}^N \int_0^{1/\eta} z \; F_j(\mathrm{d} z) + s ,\end{equation*}

which will be satisfied when $\eta = \eta^*$ , the root of

(3) \begin{equation} \sum_{j=1}^N \int_0^{1/\eta}z \; F_j(\mathrm{d} z) = s. \end{equation}

Taking $\eta = \eta^*$ and using (3), the bound in (2) for $\mathbb{E} \Phi^*$ is easily seen to be

\begin{equation*} \mathbb{E}\Phi^* \leq \sum_{j=1}^N \int_0^{1/\eta^*} F_j(\mathrm{d} z),\end{equation*}

which is the BRS inequality.
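As an illustration (not from the paper), take the $Z_j$ i.i.d. uniform on $(0,1)$, so $F_j(z) = z$ for $z \in [0,1]$. Then, provided $N \geq 2s$, equation (3) reads $N/(2\eta^2) = s$, giving $\eta^* = \sqrt{N/(2s)}$ and the familiar bound $\mathbb{E}\Phi^* \leq N/\eta^* = \sqrt{2Ns}$. The sketch below (assuming `scipy`) finds $\eta^*$ by root-finding and checks it against these closed forms.

```python
import numpy as np
from scipy.optimize import brentq

N, s = 100, 8.0    # illustrative; N >= 2s ensures 1/eta* <= 1

# For Z_j ~ U(0,1), the left side of (3) is
#   N * int_0^{min(1/eta, 1)} z dz = N * min(1/eta, 1)**2 / 2.
def lhs(eta):
    u = min(1.0 / eta, 1.0)
    return N * u * u / 2.0

eta_star = brentq(lambda e: lhs(e) - s, 1e-6, 1e6)  # root of (3)
brs_bound = N * min(1.0 / eta_star, 1.0)            # sum_j F_j(1/eta*)
# Closed forms in this case: eta* = sqrt(N/(2s)), bound = sqrt(2*N*s).
```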

Remark 1. Even if the implicit equation in (3) cannot be solved explicitly, the bound in (2) can still be applied for any choice of $\eta$ .

Acknowledgements

Thanks to Thomas Bruss for drawing this interesting inequality to my attention.

Funding information

There are no funding bodies to thank relating to the creation of this article.

Competing interests

There were no competing interests to declare which arose during the preparation or publication process of this article.

References

[1] Bruss, F. T. (2021). The BRS-inequality and its applications. Prob. Surv. 18, 44–76.
[2] Bruss, F. T. and Robertson, J. B. (1991). Wald's lemma for sums of order statistics of IID random variables. Adv. Appl. Prob. 23, 612–623.
[3] Steele, J. M. (2015). The Bruss–Robertson inequality: elaborations, extensions, and applications. Preprint, arXiv:1510.00843.
[4] Whittle, P. (1971). Optimization under Constraints: Theory and Applications of Nonlinear Programming. Wiley, Chichester.