The Ellipsoid Function
In the context of mathematical optimization, the Ellipsoid function is a convex, parameterized generalization of the sphere function:

$$ \begin{align} f_{a}(x) &= \frac{1}{2}\sum_{i=1}^{N}a_ix_i^2 \\ a_i &= a^\frac{i-1}{N-1} \end{align} $$

where $$ a \geq 1, N>1 $$, so that $$ a_i \geq 1 $$ for all $$ i $$. This function is of particular interest due to its scaling effect in higher dimensions. Notice that when $$ a=1 $$, we recover the sphere function; as $$ a $$ grows larger, however, the scaling becomes significantly stronger along the later coordinates. This increase in scaling raises the condition number, making the optimum harder to reach because the function is very sensitive to the step size.
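As a minimal illustration, the definition above can be sketched in NumPy (the function name `ellipsoid` and this vectorized form are ours, not from any standard library; the formula follows the definition above):

```python
import numpy as np

def ellipsoid(x, a):
    """Ellipsoid function f_a(x) = 0.5 * sum(a_i * x_i^2), a_i = a^((i-1)/(N-1))."""
    x = np.asarray(x, dtype=float)
    n = len(x)                         # requires N > 1
    coeffs = a ** (np.arange(n) / (n - 1))  # a_1 = 1, ..., a_N = a
    return 0.5 * np.sum(coeffs * x ** 2)

x = np.ones(5)
# With a = 1 every coefficient is 1 and f reduces to the sphere function 0.5*||x||^2.
print(ellipsoid(x, 1.0))    # 2.5
# With a large, the last coordinates dominate the sum.
print(ellipsoid(x, 1e6))
```

Evaluating at the same point with a large $$ a $$ shows how the later coordinates come to dominate the function value.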

From the first-order partial derivatives we can observe that the gradient component in the last dimension is proportional to $$ a $$:

$$ \nabla f_{a}(x) = (a_1 x_1, a_2 x_2, \ldots, a_N x_N) = (x_1, a^{\frac{1}{N-1}} x_2, \ldots, a\, x_N), $$

hence an algorithm has to take very small steps to account for the scaling and the resulting large changes. From the second-order partial derivatives we obtain the Hessian, $$ \nabla^2 f_{a}(x) = \text{diag}(a_1, \ldots, a_N) = \text{diag}(1, a^{\frac{1}{N-1}}, \ldots, a) $$, from which we can retrieve the condition number: $$ \frac{\sigma_{max}}{\sigma_{min}} = \frac{a}{1} = a $$. As a rule of thumb, we may therefore lose up to $$ \log_{10}(a) $$ digits of accuracy, on top of what is already lost to floating-point arithmetic in the numerical method itself.
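The gradient, Hessian, and condition number can be checked numerically with a short sketch (the helper name `ellipsoid_grad_hess` is ours, chosen for illustration):

```python
import numpy as np

def ellipsoid_grad_hess(x, a):
    """Gradient and Hessian diagonal of f_a(x) = 0.5 * sum(a_i * x_i^2)."""
    x = np.asarray(x, dtype=float)
    n = len(x)                              # requires N > 1
    coeffs = a ** (np.arange(n) / (n - 1))  # a_i = a^((i-1)/(N-1))
    grad = coeffs * x                       # (a_1 x_1, ..., a_N x_N)
    hess_diag = coeffs                      # Hessian is diag(a_1, ..., a_N)
    return grad, hess_diag

grad, h = ellipsoid_grad_hess(np.ones(10), 1e4)
cond = h.max() / h.min()   # sigma_max / sigma_min = a / 1 = a
print(cond)                # 10000.0
print(np.log10(cond))      # 4.0 -- roughly 4 digits of accuracy at risk
```

The last gradient component is $$ a $$ times the first at this point, which is why a fixed-step method must shrink its step to avoid overshooting along the last coordinate.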