
Let $c\in\mathbb{R}^n$ and let $X_1,X_2$ be two independent uniform samples on the unit cube in $\mathbb{R}^n$. Is there anything at all (in an analytic sense) we can say about the expectation $E\min\{c^T X_1 , c^T X_2\}$? How about if I have $k>2$ independent samples?


Let us provide an explicit (albeit complicated) expression for $EM_k$, where \begin{equation*} M_k:=\min_{1\le i\le k}c^TX_i \end{equation*} and $k$ is any natural number. Without loss of generality, each of the coordinates $c_j$ of the vector $c=(c_1,\dots,c_n)$ is nonzero; otherwise, one can reduce the dimension $n$. Note that almost surely \begin{equation*} s+M_k\ge0,\quad\text{where}\quad s:=\sum_j \max(0,-c_j). \end{equation*} So, \begin{multline*} EM_k=-s+E(s+M_k)=-s+\int_0^\infty P(s+M_k>x)\,dx \\ =-s+\int_{-s}^\infty P(M_k>u)\,du=-s+\int_{-s}^\infty P(c^T X_1>u)^k\,du. \end{multline*} In turn, by Theorem 1 (used here with $w=-c$ and $z=-u$), \begin{equation*} P(c^T X_1>u)=\frac{(-1)^n}{n!\prod_1^n c_j}\,\sum_{J\subseteq[n]}(-1)^{|J|}(c^T\,1_J-u)_+^n. \end{equation*} So, $P(c^T X_1>u)^k$ is a linear combination of products of the form \begin{equation*} \prod_{r=1}^k(c^T\,1_{J_r}-u)_+^n=I\{u<\min_{1\le r\le k} c^T\,1_{J_r}\}\prod_{r=1}^k(c^T\,1_{J_r}-u)^n, \end{equation*} which are piecewise polynomial. So, \begin{multline} EM_k =-s+ \Big(\frac{(-1)^n}{n!\prod_1^n c_j}\Big)^k \\ \times \sum_{J_1,\dots,J_k\subseteq[n]} (-1)^{\sum_{r=1}^k|J_r|} \int_{-s}^{\min_{1\le r\le k} c^T\,1_{J_r}} du\,\prod_{r=1}^k(c^T\,1_{J_r}-u)^n. \tag{1} \end{multline} The integrands in (1) are certain polynomials, and so, the integrals in (1) can be explicitly expressed.
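As a sanity check on the formula above, one can evaluate the inclusion-exclusion expression for $P(c^TX_1>u)$ directly and compute $EM_k=-s+\int_{-s}^\infty P(c^TX_1>u)^k\,du$ by numerical quadrature, then compare against a Monte Carlo average. A minimal Python sketch (the helper names `tail_prob` and `EM_k` are my own, not from any library):

```python
import itertools
import math

import numpy as np

def tail_prob(c, u):
    """P(c^T X > u) for X uniform on the unit cube in R^n,
    via the inclusion-exclusion formula quoted above (Theorem 1)."""
    n = len(c)
    total = 0.0
    for J in itertools.product([0, 1], repeat=n):       # all subsets of [n]
        sign = (-1) ** sum(J)
        total += sign * max(np.dot(c, J) - u, 0.0) ** n  # (c^T 1_J - u)_+^n
    return (-1) ** n * total / (math.factorial(n) * np.prod(c))

def EM_k(c, k, num=20001):
    """E min_{1<=i<=k} c^T X_i, computed as -s + int_{-s}^{top} P(c^T X > u)^k du,
    where top = sum_j max(0, c_j) is the a.s. maximum of c^T X."""
    s = sum(max(0.0, -cj) for cj in c)
    top = sum(max(0.0, cj) for cj in c)
    u = np.linspace(-s, top, num)
    p = np.array([tail_prob(c, ui) ** k for ui in u])
    du = u[1] - u[0]
    return -s + du * (p.sum() - 0.5 * (p[0] + p[-1]))  # trapezoid rule
```

With $k=1$ this should reduce to $E\,c^TX_1=\frac12\sum_j c_j$, a convenient first check; for $k\ge2$ it should agree with a Monte Carlo average of $\min_i c^TX_i$ to within the simulation error.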


For an upper bound, you have $E\min(X,Y)\le \min(E X,E Y)$, and so $$ E\min(c^TX_1,c^TX_2)\le c^T\cdot(1/2,\ldots,1/2)=\frac12\sum_{i=1}^n c_i.$$ For the lower bound, notice first that $\min(a,b)=(a+b-|a-b|)/2$, whence $$ E\min(c^TX_1,c^TX_2) = \frac12\sum_{i=1}^n c_i -\frac12E|c^TX_1-c^TX_2|. $$

It remains to upper-bound the latter term. Hölder's inequality comes to mind: we can bound $|c^T(X_1-X_2)|$ by $||c||_2||X_1-X_2||_2$, or by $||c||_1||X_1-X_2||_\infty$, or, say, by $||c||_\infty||X_1-X_2||_1$.

Let's see where the first bound leads. We have $$E||X_1-X_2||_2^2=\sum_{i=1}^nE(X_1(i)-X_2(i))^2 =\frac{n}{6}, $$ since for independent $U,V$ uniform on $[0,1]$ we get $E(U-V)^2=\operatorname{Var}(U-V)=2\operatorname{Var}(U)=\frac16$ (the difference $U-V$ has a triangular distribution, see https://en.wikipedia.org/wiki/Triangular_distribution ). Now, by Jensen's inequality, $E||X_1-X_2||_2 = E\sqrt{||X_1-X_2||_2^2} \le\sqrt{E||X_1-X_2||_2^2}=\sqrt{n/6} $.

This yields a lower bound of $$ \frac12\sum_{i=1}^n c_i -\frac{||c||_2}2\sqrt{\frac{n}6} \le E\min(c^TX_1,c^TX_2). $$
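Both bounds are easy to test empirically. A small Python sketch (the dimension $n=4$ and the Gaussian draw for $c$ are just example choices on my part):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
c = rng.normal(size=n)                 # an arbitrary example coefficient vector
X1 = rng.random((100_000, n))          # uniform samples on the unit cube
X2 = rng.random((100_000, n))
m = np.minimum(X1 @ c, X2 @ c).mean()  # Monte Carlo estimate of E min(c^T X1, c^T X2)

upper = 0.5 * c.sum()
lower = 0.5 * c.sum() - 0.5 * np.linalg.norm(c) * np.sqrt(n / 6)
print(lower, m, upper)                 # lower <= m <= upper should hold
```

The gap between the two bounds is $\frac12||c||_2\sqrt{n/6}$, which is much larger than the Monte Carlo noise here, so the sandwich is easy to observe.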

You'll get other estimates via the other applications of Hölder's inequality; which one is best depends on the relative sizes of $||c||_p$ for $p\in[1,\infty]$.
