Wikipedia:Reference desk/Archives/Mathematics/2020 December 7

= December 7 =

== $$\int_I \exp(-f(x))dx \geq \int_I \exp(-g(x))dx$$ if $$\int_I (f(x) - g(x))\exp(-g(x))dx = 0$$ ==
If for two functions $$f(x)$$ and $$g(x)$$ we have that:

 * $$\int_I (f(x) - g(x))\exp(-g(x))dx = 0$$ (1)

where $$I$$ is an arbitrary interval on the real line, then

 * $$\int_I \exp(-f(x))dx \geq \int_I \exp(-g(x))dx$$ (2)

This follows in a straightforward way from the Bogoliubov inequality. It seems to be a reasonably powerful tool for obtaining sharp bounds on certain integrals or summations, yet it doesn't seem to be used much for this purpose in mathematics. I was wondering whether a more straightforward proof can be given for the special case of integrals and summations.
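
One candidate for such an elementary argument (a sketch on my part, using only the pointwise bound $$\exp(-t) \geq 1 - t$$ for all real $$t$$): writing

$$\exp(-f(x)) = \exp(-g(x))\exp(-(f(x) - g(x))) \geq \exp(-g(x))\left(1 - (f(x) - g(x))\right),$$

and then integrating both sides over $$I$$, the term $$\int_I (f(x) - g(x))\exp(-g(x))dx$$ vanishes by (1), leaving exactly (2).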

This can be used to obtain sharp lower bounds for integrals of the form $$\int_I \exp(-f(x))dx$$ by choosing a trial function $$g(x)$$ containing free parameters, imposing the constraint (1) to eliminate one parameter, and then maximizing $$\int_I \exp(-g(x))dx$$ w.r.t. the remaining parameters. A simple example is to take $$f(x) = x^2$$ and $$g(x) = a + bx$$ for $$b>0$$ and $$I$$ the positive real line, which yields the inequality $$\pi \geq e$$; the computation is worked out below.
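
Spelling that example out (my own computation, worth double-checking): with $$f(x) = x^2$$, $$g(x) = a + bx$$ and $$I = (0,\infty)$$, the standard moments $$\int_0^\infty e^{-bx}dx = 1/b$$, $$\int_0^\infty x e^{-bx}dx = 1/b^2$$ and $$\int_0^\infty x^2 e^{-bx}dx = 2/b^3$$ turn the constraint (1) into

$$e^{-a}\left(\frac{2}{b^3} - \frac{a}{b} - \frac{1}{b}\right) = 0 \quad\Longrightarrow\quad a = \frac{2}{b^2} - 1.$$

Then

$$\int_0^\infty \exp(-g(x))dx = \frac{e^{-a}}{b} = \frac{1}{b}\exp\!\left(1 - \frac{2}{b^2}\right),$$

which is maximized at $$b = 2$$ (set the derivative of $$1 - 2/b^2 - \ln b$$ to zero), giving the value $$\sqrt{e}/2$$. Since $$\int_0^\infty \exp(-x^2)dx = \sqrt{\pi}/2$$, inequality (2) reads $$\sqrt{\pi}/2 \geq \sqrt{e}/2$$, i.e. $$\pi \geq e$$.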

Count Iblis (talk) 00:11, 7 December 2020 (UTC)