
BRS-inequality is the short name for Bruss-Robertson-Steele inequality. This inequality gives a convenient upper bound for the expected maximum number of non-negative random variables one can sum up without exceeding a given upper bound $$s > 0$$.

For example, suppose 100 random variables $$X_1, X_2,..., X_{100}$$ are all uniformly distributed on $$[0, 1]$$, not necessarily independent, and let $$s = 10$$, say. Let $$N[100, 10]$$ be the maximum number of the $$X_j$$ one can select from $$\{X_1, X_2,..., X_{100}\}$$ such that their sum does not exceed $$s = 10$$. $$N[100, 10]$$ is a random variable, so what can one say about bounds for its expectation? How would an upper bound for $$E(N[n, s])$$ behave if one changes the size $$n$$ of the sample and keeps $$s$$ fixed, or alternatively, if one keeps $$n$$ fixed but varies $$s$$? What can one say about $$E(N[n, s])$$ if the uniform distribution is replaced by another continuous distribution? In all generality, what can one say if each $$X_k$$ may have its own continuous distribution function $$F_k$$?

General Problem
Let $$X_1, X_2,...$$ be a sequence of non-negative random variables (possibly dependent) that are jointly continuously distributed. For $$n \in \{1, 2,... \}$$ and $$s\in R^+$$ let $$N[n, s]$$ be the maximum number of observations among $$X_1, X_2, ..., X_n$$ that one can sum up without exceeding $$s$$.

Now, to obtain $$N[n, s]$$ one may think of looking at the list of all observations: first select the smallest one, then add the second smallest, then the third, and so on, as long as the accumulated sum does not exceed $$s$$. Hence $$N[n, s]$$ can be defined in terms of the increasing order statistics of $$X_1, X_2,\cdots, X_n$$, denoted by $$X_{1,n} \le X_{2,n} \le \cdots \le X_{n,n}$$, namely by



$$\begin{align} N[n, s] = \begin{cases} 0, & {\rm ~if~}~ X_{1,n} > s,\\ \max\{~k \in \mathbb{N} :~ X_{1,n} + X_{2,n}+ \cdots + X_{k,n} \le s\}, & {\rm ~otherwise}. \end{cases} \end{align}$$ (1)
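The greedy construction described above translates directly into a short program. The following sketch (the function name `max_selection` is illustrative, not from the source) computes $$N[n, s]$$ for a given sample:

```python
# Illustrative sketch: compute N[n, s] by the greedy rule -- sort the
# observations and add them in increasing order while the running sum
# stays at or below the budget s.
def max_selection(xs, s):
    """Return N[n, s] for the sample xs and budget s."""
    count, total = 0, 0.0
    for x in sorted(xs):
        if total + x > s:
            break
        total += x
        count += 1
    return count

# Example: from 0.5, 0.2, 0.9, 0.1 with s = 0.9 one can take
# 0.1, 0.2 and 0.5 (sum 0.8), but not also 0.9.
print(max_selection([0.5, 0.2, 0.9, 0.1], 0.9))  # -> 3
```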

What is the best possible general upper bound for $$E(N[n, s])$$ if one requires only the continuity of the joint distribution of all variables? And then, how to compute this bound?

Identically distributed random variables
Theorem 1. Let $$X_1, X_2, \cdots, X_n$$ be identically distributed non-negative random variables with absolutely continuous distribution function $$F$$. Then


 * $$ E(N[n, s])\le n F(t),$$    (2)

where $$t := t(n, s)$$ solves the so-called BRS-equation


 * $$ n \int_0^t x \,dF(x)\, =\, s$$.  (3)

As an example, here are the answers to the questions posed at the beginning. Let all $$X_1, X_2,\cdots, X_n$$ be uniformly distributed on $$[0, 1]$$. Then $$F(t) = t$$ on $$[0, 1]$$, and hence $$dF(x)/dx = 1$$ on $$[0, 1]$$. The BRS-equation becomes


 * $$ n \int_0^t x dx = n t^2/2 = s.$$

The solution is $$t =\sqrt{2s/n}$$, and thus from the inequality (2)


 * $$ E(N[n, s]) \le n\,F(t) = n \sqrt{2s/n }= \sqrt{2sn}$$. (4)

Since one always has $$N[n, s] \le n$$, this bound becomes trivial for $$s \ge nE(X) = n/2$$.

For the example questions with $$n=100, s=10$$ this yields $$E(N[100, 10]) \le \sqrt{2000} \approx 44.7$$. As one sees from (4), doubling the sample size $$n$$ while keeping $$s$$ fixed, or doubling $$s$$ while keeping $$n$$ fixed, yields for the uniform distribution in the non-trivial case the same upper bound, namely $$\sqrt{2}\,\sqrt{2sn}$$.
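A quick Monte-Carlo sanity check (an illustration only; the helper `max_selection`, the seed, and the number of trials are assumptions of this sketch) compares an empirical estimate of $$E(N[100, 10])$$ for i.i.d. Uniform$$[0,1]$$ samples with the bound $$\sqrt{2sn} = \sqrt{2000} \approx 44.7$$ from (4):

```python
import math
import random

def max_selection(xs, s):
    """Greedy computation of N[n, s]: sum order statistics up to s."""
    count, total = 0, 0.0
    for x in sorted(xs):
        if total + x > s:
            break
        total += x
        count += 1
    return count

random.seed(1)
n, s, trials = 100, 10.0, 2000
# Average N[100, 10] over many independent uniform samples.
est = sum(max_selection([random.random() for _ in range(n)], s)
          for _ in range(trials)) / trials
bound = math.sqrt(2 * s * n)
print(f"estimate {est:.1f} <= bound {bound:.1f}")
```

The empirical mean stays below, but not far below, the BRS bound, in line with the asymptotic tightness mentioned in the Source section.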

Generalised BRS-inequality
Theorem 2. Let $$X_1, X_2,\cdots, X_n$$ be positive random variables that are jointly distributed such that $$X_k$$ has an absolutely continuous distribution function $$F_k$$, $$k=1, 2, \cdots, n$$. If $$N[n, s]$$ is defined as before, then


 * $$E (N[n, s])\le \sum_{k=1}^n F_k(t)$$, (5)

where $$ t := t(n, s)$$ is the unique solution of the BRS-equation


 * $$\sum_{k=1}^n \int_0^t \,x\,dF_k(x) = s.$$ (6)

Clearly, if all random variables $$X_i, i=1, 2, \cdots, n $$ have the same marginal distribution $$F$$, then (6) recaptures (3), and (5) recaptures (2). Again it should be pointed out that no independence hypothesis whatsoever is needed.
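Since the left side of (6) is continuous and non-decreasing in $$t$$, the BRS-equation can be solved numerically by bisection. The sketch below (the names `solve_brs` and `partial_means` are illustrative, not from the source) represents each marginal by its partial mean $$m_k(t) = \int_0^t x\,dF_k(x)$$ and checks the uniform case, where the exact solution $$t = \sqrt{2s/n}$$ is known:

```python
import math

# Illustrative sketch: solve the BRS-equation (6) by bisection.
# Each marginal F_k is represented by m_k(t) = \int_0^t x dF_k(x).
def solve_brs(partial_means, s, t_hi, tol=1e-10):
    """Find t in [0, t_hi] with sum_k m_k(t) = s by bisection."""
    lo, hi = 0.0, t_hi
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if sum(m(mid) for m in partial_means) < s:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Uniform[0,1] marginals: m_k(t) = t^2/2, so n copies give n t^2/2 = s,
# whose exact solution is t = sqrt(2 s / n).
n, s = 100, 10.0
t = solve_brs([lambda u: u * u / 2] * n, s, t_hi=1.0)
print(round(t, 6), round(math.sqrt(2 * s / n), 6))  # both ~ 0.447214
```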

Refinements of the BRS-inequality

Depending on the type of the distributions $$F_k$$, refinements of Theorem 2 can be of real interest. We mention just one of them.

Let $$A[n, s]$$ be the random set of those variables one can sum up to yield the maximum random number $$N[n, s]$$, that is,
 * $$\#A[n, s] = N[n, s]$$,

and let $$S_{A[n,s]}$$ denote the sum of these variables. The so-called residual $$ s-S_{A[n,s]}$$ is by definition always non-negative, and one has:

Theorem 3. Let $$X_1, X_2,\cdots, X_n$$ be jointly continuously distributed with marginal distribution functions $$ F_k, k=1, 2, \cdots,n$$, and let $$t := t(n, s)$$ be the solution of (6). Then


 * $$E(N[n, s])\le \left( \sum_{k=1}^n F_k(t(n, s))\right)-\frac{s-E(S_{A[n,s]})}{t(n,s)} $$. (7)

The improvement in (7) compared with (5) therefore consists of


 * $$\frac{s-E(S_{A[n,s]})}{t(n,s)}$$.

The expected residual in the numerator is typically difficult to compute or estimate, but there exist nice exceptions. For example, if all $$X_k$$ are independent exponential random variables, then the memoryless property implies (if $$s$$ is exceeded) the distributional symmetry of the residual and of the overshoot over $$s$$. For fixed $$s$$ one can then show that $$\frac{s-E(S_{A[n,s]})}{t(n,s)} \to 1/2 {\rm~as~} n \to \infty$$. This improvement term fluctuates around $$1/2$$, and simulations indicate that the convergence to $$1/2$$ is quick.
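The exponential case lends itself to a simulation. For i.i.d. Exp(1) variables one has $$\int_0^t x e^{-x}\,dx = 1 - e^{-t}(1+t)$$, so the BRS-equation (6) reads $$n(1 - e^{-t}(1+t)) = s$$. The following sketch (the helper names, the seed, and the trial count are assumptions of this illustration) solves it by bisection and estimates the correction term in (7):

```python
import math
import random

def brs_t_exponential(n, s):
    """Solve n * (1 - e^{-t}(1 + t)) = s for t by bisection."""
    lo, hi = 0.0, 50.0
    while hi - lo > 1e-12:
        mid = (lo + hi) / 2
        if n * (1 - math.exp(-mid) * (1 + mid)) < s:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def residual(xs, s):
    """Return s - S_{A[n,s]}: the budget left after the greedy selection."""
    total = 0.0
    for x in sorted(xs):
        if total + x > s:
            break
        total += x
    return s - total

random.seed(2)
n, s, trials = 100, 10.0, 4000
t = brs_t_exponential(n, s)
mean_res = sum(residual([random.expovariate(1.0) for _ in range(n)], s)
               for _ in range(trials)) / trials
print(f"t = {t:.3f}, correction (s - E(S_A))/t ~ {mean_res / t:.2f}")
```

In such runs the estimated correction term is close to $$1/2$$, consistent with the limit stated above.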

Source
The first version of the BRS-inequality (Theorem 1) was proved in Lemma 4.1 of F. Thomas Bruss and James B. Robertson (1991). This paper proves moreover that the upper bound is asymptotically tight if the random variables are independent of each other. The generalisation to arbitrary continuous distributions (Theorem 2) is due to J. Michael Steele (2016). Theorem 3 and other refinements of the BRS-inequality are more recent and proved in Bruss (2021).

Applications
The BRS-inequality is a versatile tool since it applies to many types of problems, and since the solution of the BRS-equation is often not very involved. Also, and in particular, one notes that the maximum number $$N[n, s]$$ always dominates the maximum number of selections under any additional constraint, such as online selections without recall. Examples studied in Steele (2016) and Bruss (2021) touch many applications, including comparisons between i.i.d. sequences and non-i.i.d. sequences, problems of condensing point processes, “awkward” processes, selection algorithms, knapsack problems, Borel-Cantelli-type problems, the Bruss-Duerinckx theorem, and online tiling strategies.