Galina Mikhailovna Korpelevich (Russian: Галина Михайловна Корпелевич; 14 July 1937 – 30 November 1985) was a Soviet mathematician. She is best known for the invention of the extragradient method for solving variational inequalities.

Biography
Bla Bla Bla

Extragradient Method for the Monotone Variational Inequality Problem
In functional analysis, given a topological vector space $$\boldsymbol{H}$$, a (possibly non-linear) operator $$F\colon \boldsymbol{H}\to \boldsymbol{H}^{\ast}$$ from $$\boldsymbol{H}$$ to its dual space $$\boldsymbol{H}^{\ast}$$ is said to be a monotone operator if $$\langle F(x)-F(y), x-y \rangle \geq 0\qquad\forall \ x,y \in \boldsymbol{H}$$, where $$\langle\cdot,\cdot\rangle\colon \boldsymbol{H}^{\ast}\times\boldsymbol{H}\to \mathbb{R}$$ is the duality pairing.
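
A standard illustrative example (not from the original text): on $$\boldsymbol{H}=\mathbb{R}^n$$, the linear operator $$F(x)=Ax$$ with a positive semidefinite matrix $$A$$ is monotone, since $$\langle F(x)-F(y), x-y \rangle = (x-y)^{\top}A(x-y) \geq 0$$ for all $$x,y\in\mathbb{R}^n$$; likewise, the gradient $$\nabla f$$ of a differentiable convex function $$f$$ is monotone.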

Given a monotone operator $$F\colon \boldsymbol{H}\to \boldsymbol{H}^{\ast}$$, the monotone variational inequality (MVI) problem is the problem of finding a point $$x^*$$ in a set $$\boldsymbol{Q}\subseteq\boldsymbol{H}$$ that satisfies the following inequality:

 * $$\langle F(x^*), y-x^* \rangle \geq 0\qquad\forall \ y \in \boldsymbol{Q}$$

where $$\langle\cdot,\cdot\rangle\colon \boldsymbol{H}^{\ast}\times\boldsymbol{H}\to \mathbb{R}$$ is the duality pairing and $$\boldsymbol{Q}$$ is a closed convex set.
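
A standard special case (used again below): when $$F=\nabla f$$ for a differentiable convex function $$f$$, the MVI is exactly the first-order optimality condition for minimizing $$f$$ over $$\boldsymbol{Q}$$, i.e. $$\langle \nabla f(x^*), y-x^* \rangle \geq 0\ \ \forall \ y \in \boldsymbol{Q}$$ holds precisely when $$x^*$$ minimizes $$f$$ on $$\boldsymbol{Q}$$.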

Extragradient Method
In her seminal work, "The extragradient method for finding saddle points and other problems", Korpelevich was the first to describe a first-order iterative optimization algorithm, the so-called extragradient method, for finding a solution to the monotone variational inequality problem.

The extragradient method is a variation of the classical gradient method (gradient descent/ascent).

In optimization gradient descent (also often called steepest descent) is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the function at the current point, because this is the direction of steepest descent. Conversely, stepping in the direction of the gradient will lead to a local maximum of that function; the procedure is then known as gradient ascent.
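
As a minimal sketch of this idea in code (the quadratic test function, step size, and iteration count below are illustrative choices, not from the source):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)  # move opposite to the gradient direction
    return x

# Illustrative use: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x_min = gradient_descent(lambda x: x, x0=[3.0, -4.0])
print(x_min)  # approaches the minimizer [0, 0]
```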

Korpelevich's Theorem
The extragradient method for solving the variational inequality has the form

$$\bar{x}_k=P_Q (x_k-\alpha F(x_k)), \quad x_{k+1}=P_Q(x_k-\alpha F(\bar{x}_k))$$

where $$P_Q$$ is the projection onto $$Q$$. If $$F(x)$$ is the gradient of a smooth convex function $$f(x)$$, then the variational inequality is the optimality condition for the minimization of $$f(x)$$ on $$Q$$, and $$F(\bar{x}_k)$$ is the extrapolated gradient of $$f$$ at $$x_k$$. This explains the name of the method.
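
A minimal sketch of the iteration in code, assuming for illustration a linear monotone operator $$F(x)=Ax+b$$ and a box constraint set $$Q=[-1,1]^2$$ (these choices, the step size, and the iteration count are assumptions, not from Korpelevich's paper):

```python
import numpy as np

def extragradient(F, project, x0, alpha, iters=1000):
    """Korpelevich's extragradient iteration: a predictor step to x_bar,
    then a corrector step using the operator evaluated at x_bar."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_bar = project(x - alpha * F(x))   # predictor (extrapolation) step
        x = project(x - alpha * F(x_bar))   # corrector step, F taken at x_bar
    return x

# Illustrative instance (assumed, not from the paper): F(x) = A x + b with
# A positive definite, Q the box [-1, 1]^2, and alpha < 1/L where L = ||A||_2.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
b = np.array([-1.0, 1.0])
F = lambda x: A @ x + b
project = lambda x: np.clip(x, -1.0, 1.0)   # projection onto the box Q
alpha = 0.9 / np.linalg.norm(A, 2)
print(extragradient(F, project, x0=[0.0, 0.0], alpha=alpha))  # approx. [1, -1]
```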

Theorem. If $$F$$ satisfies a Lipschitz condition on $$Q$$ with constant $$L$$ and $$0<\alpha <1/L$$, then $$x_k\rightarrow x^*$$ as $$k\rightarrow\infty$$.

The method can be applied to finding saddle points and to solving matrix games (in the latter case the method converges linearly, provided the solution is unique).
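
A toy illustration of the saddle-point use, offered as a sketch rather than the matrix-game setup of the paper: for the bilinear function $$L(x,y)=xy$$ (with $$Q=\mathbb{R}^2$$, so no projection is needed) the associated operator $$F(x,y)=(y,-x)$$ is monotone, plain gradient descent-ascent spirals away from the saddle point $$(0,0)$$, and the extragradient correction restores convergence; the step size and iteration count are illustrative.

```python
import numpy as np

def F(z):
    """Monotone operator of the bilinear saddle problem L(x, y) = x * y:
    descend in x (dL/dx = y) and ascend in y (-dL/dy = -x)."""
    x, y = z
    return np.array([y, -x])

z = np.array([1.0, 1.0])
alpha = 0.5                    # any alpha < 1/L works; L = 1 for this operator
for _ in range(200):
    z_bar = z - alpha * F(z)   # predictor step
    z = z - alpha * F(z_bar)   # corrector step (Q = R^2, so no projection)
print(z)  # converges to the saddle point (0, 0)
```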