Functional correlation

In statistics, functional correlation refers to measures, typically based on dimension reduction, that quantify the correlation and dependence between two random variables when the data are functional. Several approaches have been developed to quantify the relation between two functional variables.

Overview
A pair of real-valued random functions $$ \textstyle X(t)$$ and $$ \textstyle Y(t)$$ with $$ t \in T$$, a compact interval, can be viewed as realizations of square-integrable stochastic processes in a Hilbert space. Since both $$ X $$ and $$ Y $$ are infinite-dimensional, some kind of dimension reduction is required to explore their relationship. Notions of correlation for functional data include the following.

Functional canonical correlation coefficient (FCCA)
FCCA is a direct extension of multivariate canonical correlation. For a pair of random functions $$ X \in \mathcal{L}^2(\mathcal{I_X}) $$ and $$ Y \in \mathcal{L}^2(\mathcal{I_Y}) $$ the first canonical coefficient $$ \rho_1 $$ is defined as:

$$ \rho_1 = \sup_{u \in \mathcal{L}^2(\mathcal{I_X}),\, v \in \mathcal{L}^2(\mathcal{I_Y})} \operatorname{corr}(\langle u, X\rangle, \langle v, Y\rangle), \quad (1) $$
where $$\langle \cdot,\cdot\rangle$$ denotes the inner product in Lp space (p=2) i.e. $$\langle f_1,f_2\rangle = \int_{\mathcal{I}} f_1(t) f_2(t) \, dt, $$ $$ \quad f_1, f_2 \in \mathcal{L}^2(\mathcal{I}) $$

The $$ k^{th}$$ canonical coefficient $$ \rho_k $$, given $$\rho_1, \rho_2, \ldots, \rho_{k-1} $$, is defined as:

$$ \rho_k = \sup_{u \in \mathcal{L}^2(\mathcal{I_X}),\, v \in \mathcal{L}^2(\mathcal{I_Y})} \operatorname{corr}(\langle u, X\rangle, \langle v, Y\rangle), \quad (2) $$
where $$ (U_k, V_k)=(\langle u_k, X\rangle, \langle v_k, Y\rangle )$$ is uncorrelated with all previous pairs $$(U_j, V_j)=(\langle u_j, X\rangle , \langle v_j, Y\rangle )_{j=1,2,\ldots,k-1} $$.

Thus FCCA implements projections in the directions of $$ u_k $$ and $$ v_k $$ for $$ X $$ and $$ Y $$ respectively, such that the resulting linear combinations (inner products) $$ (U_k, V_k) $$ are maximally correlated. $$ X $$ and $$ Y $$ are uncorrelated if all their canonical correlations are zero, equivalently, if and only if $$ \rho_1 = 0 $$.

Alternative formulation
The cross-covariance operator for two random functions $$ X $$ and $$ Y $$ is defined as $$ \Sigma_{XY} : \mathcal{L}^2(\mathcal{I_Y}) \rightarrow \mathcal{L}^2(\mathcal{I_X}) $$, $$ \Sigma_{XY} v(t) = \int_{\mathcal{I_Y}} \operatorname{cov}( X(t), Y(s)) v(s) \, ds, \quad v \in \mathcal{L}^2(\mathcal{I_Y}), $$ and analogously the auto-covariance operators $$ \Sigma_{XX} $$ for $$ X $$ and $$ \Sigma_{YY} $$ for $$ Y $$. Using $$ \operatorname{cov} (\langle u, X\rangle, \langle v, Y\rangle ) = \langle u, \Sigma_{XY} v \rangle $$, the $$ k^{th}$$ canonical coefficient $$ \rho_k $$ in (2) can be re-written as

$$ \rho_k = \sup_{u \in \mathcal{L}^2(\mathcal{I_X}),\, v \in \mathcal{L}^2(\mathcal{I_Y})} \frac{\langle u, \Sigma_{XY} v \rangle}{\langle u, \Sigma_{XX} u \rangle^{1/2} \langle v, \Sigma_{YY} v \rangle^{1/2}}, \quad (3) $$

where $$ (U_k, V_k)=(\langle u_k, X\rangle, \langle v_k, Y\rangle )$$ is uncorrelated with all previous pairs $$(U_j, V_j)=(\langle u_j, X\rangle , \langle v_j, Y\rangle ), \; j=1,2,\ldots,k-1 $$.

Maximizing (3) is equivalent to finding eigenvalues and eigenvectors of the operator $$ R= \Sigma^{-1/2}_{XX} \Sigma_{XY} \Sigma^{-1/2}_{YY} $$.
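In practice, the curves are observed on a grid and the operators in (3) become matrices. The following is a minimal numerical sketch, using simulated (hypothetical) data and a ridge term added to the auto-covariances, of computing $$ \rho_1 $$ as the leading singular value of a discretized version of $$ R $$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n curve pairs observed on a common grid of m points;
# X and Y share random Fourier coefficients, so they are dependent.
n, m = 300, 50
t = np.linspace(0.0, 1.0, m)
scores = rng.normal(size=(n, 2))
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X = scores @ basis + 0.1 * rng.normal(size=(n, m))
Y = scores @ basis[::-1] + 0.1 * rng.normal(size=(n, m))

# Discretized covariance operators (m x m matrices).
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sxx = Xc.T @ Xc / n
Syy = Yc.T @ Yc / n
Sxy = Xc.T @ Yc / n

# Ridge-regularized inverse square roots: without the ridge the
# small eigenvalues of the compact operators make the inverse
# numerically unstable (the ill-posed problem discussed below).
def inv_sqrt(S, ridge=1e-2):
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w + ridge)) @ V.T

R = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
rho1 = np.linalg.svd(R, compute_uv=False)[0]  # first canonical coefficient
print(rho1)
```

By the Cauchy–Schwarz inequality the regularized coefficient stays in $$[0, 1]$$; shrinking the ridge toward zero drives it toward 1, illustrating the upward bias discussed below.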

Challenges
Since $$ \Sigma_{XX} $$ and $$ \Sigma_{YY} $$ are compact operators, the square roots of the auto-covariance operators of $$ \mathcal{L}^2 $$ processes may not be invertible, so the existence of $$ R $$, and hence computing its eigenvalues and eigenvectors, is an ill-posed problem. As a consequence of this inverse problem, overfitting may occur, which may lead to an unstable correlation coefficient: $$ \rho_1$$ tends to be biased upwards, and therefore close to 1, and is hence difficult to interpret. FCCA also requires densely recorded functional data so that the inner products in (2) can be accurately evaluated.

Possible solutions
Some possible solutions to this problem have been discussed:

 * Restricting the maximization of (1) to discrete $$ l^2 $$ sequence spaces or to a reproducing kernel Hilbert space instead of the entire $$\mathcal{L}^2$$.
 * Using cross-validation to regularize the FCCA in practical implementations.
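The cross-validation approach can be sketched numerically: discretize the curves, compute a ridge-regularized FCCA for several penalty values, and keep the penalty whose leading canonical directions generalize best to held-out curves. All data and variable names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical discretized curve pairs, split into train / validation.
n, m = 400, 40
t = np.linspace(0.0, 1.0, m)
scores = rng.normal(size=(n, 2))
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X = scores @ basis + 0.2 * rng.normal(size=(n, m))
Y = scores @ basis[::-1] + 0.2 * rng.normal(size=(n, m))
Xtr, Xva, Ytr, Yva = X[:300], X[300:], Y[:300], Y[300:]

def first_directions(X, Y, ridge):
    """Leading canonical directions of a ridge-regularized FCCA."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx, Syy = Xc.T @ Xc / len(X), Yc.T @ Yc / len(Y)
    Sxy = Xc.T @ Yc / len(X)
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w + ridge)) @ V.T
    U, _, Vt = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy))
    return inv_sqrt(Sxx) @ U[:, 0], inv_sqrt(Syy) @ Vt[0]

# Keep the ridge whose directions give the highest held-out
# correlation, guarding against the upward bias of the
# unregularized coefficient.
best = max(
    (abs(np.corrcoef((Xva - Xva.mean(0)) @ u, (Yva - Yva.mean(0)) @ v)[0, 1]), r)
    for r in (1e-4, 1e-3, 1e-2, 1e-1)
    for u, v in [first_directions(Xtr, Ytr, r)]
)
print(best)  # (held-out correlation, chosen ridge)
```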

Functional singular correlation analysis (FSCA)
FSCA bypasses the inverse problem by replacing correlation with covariance in the objective function of (2). FSCA aims to quantify the dependency of $$ X$$ and $$Y$$ by implementing the concept of functional singular-value decomposition for the cross-covariance operator, and can be viewed as an extension of analyses using singular-value decomposition of vector data to functional data. For a pair of random functions $$ X \in \mathcal{L}^2(\mathcal{I_X}) $$ and $$ Y \in \mathcal{L}^2(\mathcal{I_Y}) $$ with smooth mean functions $$ \mu_X(t)=\mathbb{E}(X(t))$$ and $$ \mu_Y(t)=\mathbb{E}(Y(t))$$ and smooth covariance functions, FSCA aims at a "functional covariance" corresponding to the first singular value of the cross-covariance operator $$\Sigma_{XY}$$,

$$ \sup_{\|u\| = \|v\| = 1} \operatorname{cov}(\langle u, X\rangle, \langle v, Y\rangle), \quad (4) $$

which is attained at functions $$u_1 \in \mathcal{L}^2(\mathcal{I_X}), v_1 \in \mathcal{L}^2(\mathcal{I_Y})$$. A standardized version of this serves as a functional correlation and is defined as

$$ \rho = \frac{\operatorname{cov}(\langle u_1, X\rangle, \langle v_1, Y\rangle)}{\operatorname{var}(\langle u_1, X\rangle)^{1/2} \operatorname{var}(\langle v_1, Y\rangle)^{1/2}}. $$

The singular representation of the cross-covariance can be employed to find a solution to the maximization problem (4). Analogously, this concept extends to the first $$k $$ ordered singular correlation coefficients $$ \rho_1, \rho_2,\ldots,\rho_k $$.
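Because (4) needs no operator inversion, a discretized FSCA is a plain singular-value decomposition of the cross-covariance matrix. A minimal sketch with simulated (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated paired curves on a common grid; grid-spacing constants
# are dropped, since they rescale all terms uniformly.
n, m = 300, 50
t = np.linspace(0.0, 1.0, m)
scores = rng.normal(size=(n, 2))
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
X = scores @ basis + 0.1 * rng.normal(size=(n, m))
Y = scores @ basis[::-1] + 0.1 * rng.normal(size=(n, m))

Xc, Yc = X - X.mean(0), Y - Y.mean(0)
Sxy = Xc.T @ Yc / n            # discretized cross-covariance operator

# First singular triple of Sigma_XY: the singular value is the
# "functional covariance" (4); u1, v1 are the singular functions.
U, s, Vt = np.linalg.svd(Sxy)
u1, v1, cov1 = U[:, 0], Vt[0], s[0]

# Standardize to obtain a functional correlation in [-1, 1].
Ux, Vy = Xc @ u1, Yc @ v1      # projection scores
rho = np.corrcoef(Ux, Vy)[0, 1]
print(cov1, rho)
```

No regularization is needed here: the supremum in (4) is over unit-norm functions, so the problem is well-posed, in contrast to FCCA.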

Correlation as angle between functions
In the multivariate case, the inner product of two vectors $$a $$ and $$b $$ is defined as $$ \langle a,b \rangle = \|a\| \|b\| \cos\alpha, $$ where $$ \alpha $$ is the angle between $$a $$ and $$b $$. This can be extended to the space of square-integrable random functions. For this notion to be a meaningful measure of alignment of shapes, the integrals of the functions, which are their projections on the constant function 1, are subtracted. This part corresponds to a "static part", and the remainder can be thought of as a "dynamic part" of each random function. The cosine of the $$L^2$$ angle between these "dynamic parts" then provides a correlation measure of functional shapes. Denoting $$ M_1=\langle X,1 \rangle$$ and $$ M_2=\langle Y,1 \rangle$$, the standardized curves may be defined as

$$ X^*(t)=\left(X(t)-M_1\right) \Big / \left(\int (X(t)-M_1)^2 \, dt\right)^{1/2}, \quad Y^*(t)=\left(Y(t)-M_2\right) \Big / \left(\int (Y(t)-M_2)^2 \,dt\right)^{1/2}, $$

and the correlation is defined as

$$ \rho = \mathbb{E} \langle X^*, Y^* \rangle = \mathbb{E} \int X^*(t) Y^*(t) \, dt. $$
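The standardization above is straightforward to approximate on a uniform grid. A minimal sketch for a single pair of observed curves (the function name and test curves are hypothetical):

```python
import numpy as np

def angle_correlation(x, y, t):
    """Cosine of the L^2 angle between the 'dynamic parts' of two
    curves sampled on a uniform grid t (Riemann-sum quadrature)."""
    dt = t[1] - t[0]
    span = t[-1] - t[0]
    # Subtract the "static part": the projection onto the constant
    # function 1, i.e. the mean level of each curve.
    xs = x - x.sum() * dt / span
    ys = y - y.sum() * dt / span
    # Normalize the "dynamic parts" to unit L^2 norm.
    xs = xs / np.sqrt((xs**2).sum() * dt)
    ys = ys / np.sqrt((ys**2).sum() * dt)
    return float((xs * ys).sum() * dt)   # <X*, Y*> = cos(alpha)

t = np.linspace(0.0, 1.0, 200)
x = np.sin(2 * np.pi * t) + 3.0          # same shape, shifted level
y = 2.0 * np.sin(2 * np.pi * t) - 1.0    # same shape, rescaled
print(angle_correlation(x, y, t))        # close to 1: identical shapes
```

Vertical shifts and rescalings leave the measure unchanged, which is exactly what makes it a comparison of shapes rather than of levels.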