Algebraic signal processing

Algebraic signal processing (ASP) is an emerging area of theoretical signal processing (SP). In the algebraic theory of signal processing, a set of filters is treated as an (abstract) algebra, a set of signals is treated as a module or vector space, and convolution is given by an algebra representation. The advantages of algebraic signal processing are its generality and portability.

History
In the original formulation of algebraic signal processing by Püschel and Moura, the signals are collected in an $$\mathcal{A} $$-module for some algebra $$\mathcal{A} $$ of filters, and filtering is given by the action of $$\mathcal{A} $$ on the $$\mathcal{A} $$-module.

Definitions
Let $$K$$ be a field, for instance the complex numbers, and let $$\mathcal{A}$$ be a $$K$$-algebra (i.e. a vector space over $$K$$ with a binary operation $$\ast: \mathcal{A} \otimes \mathcal{A} \to \mathcal{A}$$ that is linear in both arguments), treated as a set of filters. Suppose $$\mathcal{M}$$ is a vector space representing a set of signals. A representation of $$\mathcal{A}$$ consists of an algebra homomorphism $$\rho: \mathcal{A} \to \mathrm{End}(\mathcal{M})$$, where $$\mathrm{End}(\mathcal{M})$$ is the algebra of linear transformations $$T: \mathcal{M} \to \mathcal{M}$$ under composition (equivalent, in the finite-dimensional case, to matrix multiplication). For convenience, we write $$\rho_{a}$$ for the endomorphism $$\rho(a)$$. To be an algebra homomorphism, $$\rho$$ must not only be a linear transformation, but must also satisfy the property

$$\rho_{a \ast b} = \rho_{a}\circ\rho_b \quad \forall a, b \in \mathcal{A}$$

Given a signal $$x \in \mathcal{M}$$, convolution of the signal by a filter $$a \in \mathcal{A}$$ yields a new signal $$\rho_a(x)$$.

Some additional terminology is needed from the representation theory of algebras. A subset $$\mathcal{G} \subseteq \mathcal{A}$$ is said to generate the algebra if every element of $$\mathcal{A}$$ can be written as a polynomial in the elements of $$\mathcal{G}$$. The image $$\rho_g$$ of a generator $$g \in \mathcal{G}$$ is called a shift operator. In practically all examples, convolutions are formed as polynomials in $$\mathrm{End}(\mathcal{M})$$ generated by shift operators. However, this is not necessarily the case for a representation of an arbitrary algebra.
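The defining property $$\rho_{a \ast b} = \rho_a \circ \rho_b$$ can be checked numerically. The following is a minimal sketch (the specific shift matrix and polynomials are illustrative assumptions, not from the text): the algebra of polynomials acts on $$\mathbb{R}^4$$ through a single cyclic shift matrix, and multiplying two polynomials corresponds to composing their images.

```python
import numpy as np

# Cyclic shift operator on R^4: (S x)_n = x_{n-1} (indices mod 4).
S = np.roll(np.eye(4), 1, axis=0)

def rho(coeffs):
    """Representation of the polynomial algebra: a polynomial in the
    indeterminate t with coefficients `coeffs` becomes a polynomial in S."""
    out = np.zeros((4, 4))
    for k, h in enumerate(coeffs):
        out += h * np.linalg.matrix_power(S, k)
    return out

a = [1.0, 2.0]         # a(t) = 1 + 2t
b = [0.0, 3.0, 1.0]    # b(t) = 3t + t^2
ab = np.convolve(a, b)  # polynomial multiplication = coefficient convolution

# Homomorphism property: rho_{a*b} = rho_a composed with rho_b.
assert np.allclose(rho(ab), rho(a) @ rho(b))
```

Here the generator is the single polynomial $$t$$, and its image $$\rho_t = S$$ is the shift operator.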

Discrete Signal Processing
In discrete signal processing (DSP), the signal space is the set of complex-valued sequences $$\mathcal{M} = \ell^2(\mathbb{Z})$$ with bounded energy (i.e. square-summable sequences). This means the infinite series $$\sum_{n = -\infty}^{\infty} |(x)_n|^2 < \infty$$, where $$|\cdot|$$ is the modulus of a complex number. The shift operator is given by the linear endomorphism $$(S x)_n = (x)_{n-1}$$. The filter space is the algebra of Laurent polynomials with complex coefficients $$\mathcal{A} = \mathbb{C}[z^{-1},z]$$, and convolution is given by $$\rho_{h} = \sum_{k = -\infty}^{\infty} h_k S^k$$ where $$h(z) = \sum_{k=-\infty}^{\infty} h_k z^k$$ is an element of the algebra. Filtering a signal $$x$$ by $$h$$ then yields $$(y)_n = \sum_{k=-\infty}^{\infty} h_k (x)_{n-k}$$, because $$(S^k x)_n = (x)_{n-k}$$.
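For finitely supported signals and filters, the formula $$(y)_n = \sum_k h_k x_{n-k}$$ is ordinary discrete convolution. A small sketch (the particular signal and filter values are illustrative assumptions) evaluates the sum directly and compares it against `numpy.convolve`:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # signal, supported on n = 0..3
h = np.array([0.5, 0.5])             # filter taps, supported on k = 0..1

# Direct evaluation of y_n = sum_k h_k x_{n-k}, truncating the infinite
# sum to the indices where both factors are nonzero.
y_direct = np.zeros(len(x) + len(h) - 1)
for n in range(len(y_direct)):
    for k in range(len(h)):
        if 0 <= n - k < len(x):
            y_direct[n] += h[k] * x[n - k]

assert np.allclose(y_direct, np.convolve(h, x))
```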

Graph Signal Processing
A weighted graph is an undirected graph $$\mathcal{G} = (\mathcal{V}, \mathcal{E})$$ together with nonnegative edge weights $$a_{ij}$$ on the node set $$\mathcal{V}$$. A graph signal is simply a real-valued function on the set of nodes of the graph; in graph neural networks, graph signals are sometimes called features. The signal space is the set of all graph signals $$\mathcal{M} = \mathbb{R}^\mathcal{V}$$, where $$\mathcal{V}$$ is the set of $$n = |\mathcal{V}| $$ nodes of $$\mathcal{G} = (\mathcal{V}, \mathcal{E})$$. The filter algebra is the algebra of polynomials in one indeterminate, $$\mathcal{A} = \mathbb{R}[t]$$. There are a few possible choices for a graph shift operator (GSO). The (un)normalized weighted adjacency matrix $$[A]_{ij} = a_{ij}$$ is a popular choice, as is the (un)normalized graph Laplacian $$[L]_{ij} = \begin{cases} \sum_{j' = 1}^{n}a_{ij'} & i = j \\ -a_{ij} & i \neq j \end{cases} $$. The choice depends on performance and design considerations. If $$S $$ is the GSO, then a graph convolution is the linear transformation $$\rho_h = \sum_{k=0}^{K} h_k S^k $$ for some polynomial $$h(t) = \sum_{k=0}^{K} h_k t^k$$, and convolution of a graph signal $$\mathbf{x}: \mathcal{V} \to \mathbb{R} $$ by a filter $$h(t) $$ yields a new graph signal $$\mathbf{y} = \left(\sum_{k=0}^{K} h_k S^k \right) \mathbf{x} $$.
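A graph convolution is straightforward to implement once a GSO is fixed. The following sketch (the 4-node path graph with unit weights is an illustrative assumption) builds the adjacency matrix and graph Laplacian and applies a polynomial filter in the chosen shift operator:

```python
import numpy as np

# Path graph on 4 nodes with unit edge weights.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A   # [L]_ii = sum_j a_ij, [L]_ij = -a_ij

def graph_filter(S, coeffs, x):
    """y = (h_0 I + h_1 S + h_2 S^2 + ...) x for a polynomial filter h."""
    y = np.zeros_like(x)
    Sk_x = x.copy()              # S^0 x
    for h in coeffs:
        y += h * Sk_x
        Sk_x = S @ Sk_x          # advance to S^{k+1} x
    return y

x = np.array([1.0, 0.0, 0.0, 0.0])          # a graph signal on 4 nodes
y = graph_filter(A, [0.5, 0.25, 0.25], x)   # GSO = adjacency matrix
```

Replacing `A` by `L` in the last line filters with the Laplacian as the GSO instead; both are valid shift operators, and the text's design considerations govern the choice.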

Other Examples
Other mathematical objects with their own proposed signal-processing frameworks give rise to algebraic signal models. These objects include quivers, graphons, semilattices, finite groups, and Lie groups, among others.

Intertwining Maps
In the framework of representation theory, relationships between two representations of the same algebra are described with intertwining maps, which in the context of signal processing translate to transformations of signals that respect the algebra structure. Suppose $$\rho: \mathcal{A} \to \mathrm{End}(\mathcal{M})$$ and $$\rho': \mathcal{A} \to \mathrm{End}(\mathcal{M}')$$ are two different representations of $$\mathcal{A}$$. An intertwining map is a linear transformation $$\alpha: \mathcal{M} \to \mathcal{M}'$$ such that

$$\alpha \circ \rho_a = \rho'_a \circ \alpha \quad \forall a \in \mathcal{A}$$

Intuitively, this means that filtering a signal by $$a$$ and then transforming it with $$\alpha$$ is equivalent to first transforming the signal with $$\alpha$$, then filtering by $$a$$. The z-transform is a prototypical example of an intertwining map.
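In the cyclic (finite-dimensional) setting, the discrete Fourier transform plays this role: it intertwines the "time domain" representation, where filters act as polynomials in the cyclic shift, with a "frequency domain" representation, where they act as polynomials in a diagonal matrix. A small sketch (the size $$N = 4$$ and filter coefficients are illustrative assumptions):

```python
import numpy as np

N = 4
F = np.fft.fft(np.eye(N))              # DFT matrix, F[k, n] = exp(-2*pi*i*k*n/N)
S = np.roll(np.eye(N), 1, axis=0)      # cyclic shift, (S x)_n = x_{n-1}
Lam = np.diag(np.exp(-2j * np.pi * np.arange(N) / N))  # eigenvalues of S

# F diagonalizes the shift: F S = Lam F.
assert np.allclose(F @ S, Lam @ F)

# Hence F intertwines rho (polynomials in S) with rho' (polynomials in Lam):
# F rho_a = rho'_a F for every filter a.
a = [1.0, -0.5, 0.25]
rho_a  = sum(c * np.linalg.matrix_power(S, k)   for k, c in enumerate(a))
rhop_a = sum(c * np.linalg.matrix_power(Lam, k) for k, c in enumerate(a))
assert np.allclose(F @ rho_a, rhop_a @ F)
```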

Algebraic Neural Networks
Inspired by the recent perspective that popular graph neural network (GNN) architectures are in fact convolutional neural networks (CNNs), recent work has focused on developing novel neural network architectures from the algebraic point of view. An algebraic neural network is a composition of algebraic convolutions, possibly with multiple features and feature aggregations, and pointwise nonlinearities.
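A minimal sketch of this composition, under assumed conventions (not from a specific paper): each layer applies one polynomial filter in a shift operator per (input feature, output feature) pair, sums over input features, and applies a pointwise nonlinearity; stacking layers gives a small algebraic neural network.

```python
import numpy as np

def algebraic_layer(S, H, X):
    """One algebraic-convolution layer with a pointwise ReLU.
    S: (n, n) shift operator; H: (F_in, F_out, K) filter coefficients,
    one degree-(K-1) polynomial per feature pair; X: (n, F_in) input
    signals, one column per feature. Returns (n, F_out)."""
    K = H.shape[2]
    Y = np.zeros((X.shape[0], H.shape[1]))
    Sk_X = X
    for k in range(K):
        Y += Sk_X @ H[:, :, k]   # tap k, summed over input features
        Sk_X = S @ Sk_X          # advance to S^{k+1} X
    return np.maximum(Y, 0.0)    # pointwise nonlinearity

rng = np.random.default_rng(0)
S = np.roll(np.eye(5), 1, axis=0)      # cyclic shift as the GSO
X = rng.standard_normal((5, 2))        # 5 nodes, 2 input features
H1 = rng.standard_normal((2, 4, 3))    # layer 1: 2 -> 4 features, K = 3
H2 = rng.standard_normal((4, 1, 3))    # layer 2: 4 -> 1 feature
out = algebraic_layer(S, H2, algebraic_layer(S, H1, X))
```

Swapping the cyclic shift for a graph Laplacian or adjacency matrix recovers a basic GNN layer, which is exactly the portability the algebraic viewpoint emphasizes.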