User talk:Catslash/Parallel sum

In electrical engineering and mathematics, the parallel sum is a commutative, associative binary operation which derives from the formula for the resistance of resistors in parallel. The parallel sum is represented by an infix operator, which in elementary electrical texts is written as a pair of parallel lines "||", but in the academic literature more usually as a colon ":". David Ellerman claims that the parallel sum is "just as good" as the series sum.

Motivation
The parallel-sum notation allows the combined resistance of a series-parallel network to be written compactly, without nested fractions. For example, the following three expressions all give the resistance of the same network of six resistors: the first uses the parallel-sum operator, the second nested reciprocals, and the third a single rational expression.


 * $$R = R_{1} \parallel (R_{\mathrm{21}} + R_{\mathrm{22}}) \parallel (R_{\mathrm{31}} + (R_{\mathrm{32}} \parallel R_{\mathrm{33}}))$$


 * $$R = \frac{1}{\frac{1}{R_{1}} + \frac{1}{R_{\mathrm{21}} + R_{\mathrm{22}}} + \frac{1}{R_{\mathrm{31}} + \frac{1}{\frac{1}{R_{\mathrm{32}}} + \frac{1}{R_{\mathrm{33}}}}}}$$


 * $$R = \frac{R_{1} (R_{\mathrm{21}} + R_{\mathrm{22}}) (R_{\mathrm{31}} R_{\mathrm{33}} + R_{\mathrm{31}} R_{\mathrm{32}} + R_{\mathrm{32}} R_{\mathrm{33}})}{(R_{\mathrm{21}} + R_{\mathrm{22}} + R_{1}) (R_{\mathrm{31}} R_{\mathrm{33}} + R_{\mathrm{31}} R_{\mathrm{32}} + R_{\mathrm{32}} R_{\mathrm{33}}) + R_{1} (R_{\mathrm{21}} + R_{\mathrm{22}}) (R_{\mathrm{33}} + R_{\mathrm{32}})}$$
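
The equivalence of the three forms can be spot-checked numerically. The following Python sketch uses arbitrary example resistor values and an illustrative helper `par` (neither is from the text):

```python
# Numerical check that the three expressions for R agree.
# Resistor values are arbitrary example choices.

def par(*rs):
    """Parallel sum 1 / (1/r1 + 1/r2 + ...)."""
    return 1.0 / sum(1.0 / r for r in rs)

R1, R21, R22, R31, R32, R33 = 10.0, 4.0, 6.0, 3.0, 5.0, 7.0

# Operator form: R1 || (R21 + R22) || (R31 + (R32 || R33))
R_a = par(R1, R21 + R22, R31 + par(R32, R33))

# Nested-reciprocal form
R_b = 1.0 / (1.0 / R1
             + 1.0 / (R21 + R22)
             + 1.0 / (R31 + 1.0 / (1.0 / R32 + 1.0 / R33)))

# Single rational expression
num = R1 * (R21 + R22) * (R31 * R33 + R31 * R32 + R32 * R33)
den = ((R21 + R22 + R1) * (R31 * R33 + R31 * R32 + R32 * R33)
       + R1 * (R21 + R22) * (R33 + R32))
R_c = num / den

assert abs(R_a - R_b) < 1e-12 and abs(R_a - R_c) < 1e-12
```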

Definition
The parallel sum of two terms may be defined either as


 * $$a \parallel b = \frac{1}{\frac{1}{a} + \frac{1}{b}}$$

or as


 * $$a \parallel b = \frac{a b}{a + b}$$

The former definition is more readily generalized to more than two terms


 * $$a \parallel b \parallel c \parallel \cdots \parallel z = \frac{1}{\frac{1}{a} + \frac{1}{b} + \frac{1}{c} + \cdots + \frac{1}{z}}$$

but the latter definition has the advantage of remaining valid when one of the two arguments is zero.
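
A minimal Python sketch of the two-argument product form (the helper name `par2` is illustrative), showing that it tolerates a zero argument and behaves commutatively and associatively:

```python
def par2(a, b):
    """Two-term parallel sum a*b/(a+b); valid even when one argument is zero."""
    return a * b / (a + b)

# The reciprocal form 1/(1/a + 1/b) fails when a or b is zero;
# the product form does not:
assert par2(3.0, 6.0) == 2.0      # 1/(1/3 + 1/6) = 2
assert par2(0.0, 5.0) == 0.0      # a zero resistance shorts the pair

# Commutativity and associativity (up to floating-point rounding):
assert par2(3.0, 6.0) == par2(6.0, 3.0)
assert abs(par2(par2(2.0, 3.0), 6.0) - par2(2.0, par2(3.0, 6.0))) < 1e-12
```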

For matrix arguments
The parallel sum of two non-singular square matrices may be defined as


 * $$\mathbf{A} \parallel \mathbf{B} = (\mathbf{A}^{- 1} + \mathbf{B}^{- 1})^{- 1}$$

This definition may be extended to singular matrices by rewriting it as


 * $$\mathbf{A} \parallel \mathbf{B} = \mathbf{B} (\mathbf{A} + \mathbf{B})^{- 1} \mathbf{A}$$

provided only that the sum of the two matrices is non-singular. If the sum of the matrices is singular, then the parallel sum may still be defined by adopting a particular generalized inverse, such as the Moore–Penrose generalized inverse (Duffin). Alternatively, the two matrices may be considered parallel summable only when the result is the same irrespective of the choice of generalized inverse in


 * $$\mathbf{A} \parallel \mathbf{B} = \mathbf{A} (\mathbf{A} + \mathbf{B})^{- 1} \mathbf{B}$$
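
A sketch in Python with NumPy (the helper `mpar` and the example matrices are illustrative choices, not from the text), checking that this form agrees with the inverse-based definition for non-singular arguments, that the two orderings coincide, and that a singular argument is tolerated:

```python
import numpy as np

def mpar(A, B):
    """Matrix parallel sum B (A + B)^{-1} A; requires only A + B non-singular."""
    return B @ np.linalg.solve(A + B, A)

# Arbitrary example matrices:
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# For non-singular A and B this agrees with (A^-1 + B^-1)^-1:
inv_form = np.linalg.inv(np.linalg.inv(A) + np.linalg.inv(B))
assert np.allclose(mpar(A, B), inv_form)

# The orderings B(A+B)^-1 A and A(A+B)^-1 B give the same result:
assert np.allclose(mpar(A, B), mpar(B, A))

# A singular argument is fine as long as the sum is non-singular:
S = np.array([[1.0, 0.0],
              [0.0, 0.0]])
assert np.allclose(mpar(S, B), mpar(B, S))
```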

For vector arguments
For two vectors $$\scriptstyle{\mathbf{U}}$$ and $$\scriptstyle{\mathbf{V}}$$ with non-zero $$\scriptstyle{\mathbf{U} + \mathbf{V}}$$, Anderson and Trapp define the parallel sum as


 * $$\mathbf{U} \parallel \mathbf{V} = \frac{\mathbf{V}^{2} \mathbf{U} + \mathbf{U}^{2} \mathbf{V}}{(\mathbf{U} + \mathbf{V})^{2}}$$

If additionally $$\scriptstyle{\mathbf{U}}$$ and $$\scriptstyle{\mathbf{V}}$$ are each non-zero, this can be written as


 * $$\mathbf{U} \parallel \mathbf{V} = \frac{\frac{\mathbf{U}}{\mathbf{U}^{2}} + \frac{\mathbf{V}}{\mathbf{V}^{2}}}{\left( \frac{\mathbf{U}}{\mathbf{U}^{2}} + \frac{\mathbf{V}}{\mathbf{V}^{2}}\right)^{2}}$$

That is, the (generalized) inverse of the vector $$\scriptstyle{\mathbf{U}}$$ is taken to be $$\scriptstyle{\mathbf{inv}(\mathbf{U}) = \frac{\mathbf{U}}{\mathbf{U}^{2}}}$$, the smallest vector satisfying $$\scriptstyle{(\mathbf{inv}(\mathbf{U}))\cdot \mathbf{U} = 1}$$.
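
A NumPy sketch of the two vector formulas (the helper names `vpar` and `vinv` and the example vectors are illustrative):

```python
import numpy as np

def vpar(U, V):
    """Vector parallel sum (V^2 U + U^2 V) / (U + V)^2; needs U + V nonzero."""
    W = U + V
    return (np.dot(V, V) * U + np.dot(U, U) * V) / np.dot(W, W)

def vinv(U):
    """Generalized inverse U / U^2: the smallest vector with vinv(U) . U == 1."""
    return U / np.dot(U, U)

U = np.array([3.0, 0.0])
V = np.array([0.0, 4.0])

# inv(U) . U = 1:
assert np.isclose(np.dot(vinv(U), U), 1.0)

# For nonzero U and V the parallel sum is inv(inv(U) + inv(V)):
assert np.allclose(vpar(U, V), vinv(vinv(U) + vinv(V)))
```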

Inequalities
Whereas the ordinary sum of any two non-negative real numbers is greater than or equal to each of them, the parallel sum of two such numbers is less than or equal to each (for example, $$\scriptstyle{3 \parallel 6 = 2}$$). For positive semi-definite matrix arguments, corresponding inequalities hold for the trace, determinant and norm:


 * $$\mathrm{tr}(\mathbf{A} \parallel \mathbf{B}) \le \mathrm{tr}(\mathbf{A}) \parallel \mathrm{tr}(\mathbf{B})$$


 * $$\det(\mathbf{A} \parallel \mathbf{B}) \le \det(\mathbf{A}) \parallel \det(\mathbf{B})$$


 * $$\left\lVert \mathbf{A} \parallel \mathbf{B} \right\rVert \le \left\lVert \mathbf{A} \right\rVert \parallel \left\lVert \mathbf{B} \right\rVert$$
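
These inequalities can be spot-checked numerically. In the sketch below the helper names and the example positive-definite matrices are arbitrary, and the spectral norm is used as one choice of matrix norm:

```python
import numpy as np

def mpar(A, B):
    """Matrix parallel sum B (A + B)^{-1} A."""
    return B @ np.linalg.solve(A + B, A)

def spar(a, b):
    """Scalar parallel sum."""
    return a * b / (a + b)

# Arbitrary positive-definite example matrices:
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = np.array([[3.0, 0.0],
              [0.0, 1.0]])

P = mpar(A, B)
assert np.trace(P) <= spar(np.trace(A), np.trace(B))
assert np.linalg.det(P) <= spar(np.linalg.det(A), np.linalg.det(B))
# Spectral (2-) norm, one common choice of matrix norm:
assert np.linalg.norm(P, 2) <= spar(np.linalg.norm(A, 2), np.linalg.norm(B, 2))
```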

Lehman's series-parallel inequality

 * $$(A + B) \parallel (C + D) \ge (A \parallel C) + (B \parallel D)$$

More generally, for an array of resistors with m rows in series by n parallel columns, the overall resistance is at least as high when the series connections are made first.


 * $$\overset{n}{\underset{j=1}{\Big|\,\Big|}} \, \sum_{i=1}^{m} R_{i,j} \, \ge \, \sum_{i=1}^{m} \overset{n}{\underset{j=1}{\Big|\,\Big|}} R_{i,j}$$

and this likewise holds when the $$\scriptstyle{R_{i,j}}$$ are positive-definite square matrices.
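
A numerical spot-check of the array inequality for a 2 × 2 resistor array (arbitrary example values; the helper `par` is illustrative):

```python
# Series-parallel inequality: connecting each column in series first,
# then the columns in parallel, gives at least as much resistance as
# paralleling each row first and then connecting the rows in series.

def par(*rs):
    """Parallel sum 1 / (1/r1 + 1/r2 + ...)."""
    return 1.0 / sum(1.0 / r for r in rs)

# R[i][j]: row i (series position), column j (parallel branch).
R = [[1.0, 4.0],
     [2.0, 3.0]]
m, n = 2, 2

series_first = par(*(sum(R[i][j] for i in range(m)) for j in range(n)))
parallel_first = sum(par(*(R[i][j] for j in range(n))) for i in range(m))

assert series_first >= parallel_first   # 2.1 >= 2.0 for these values
```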

Parity with the 'serial' sum
In electrical engineering, the propensity of a material body to pass an electrical current may be quantified either by the body's resistance (the voltage drop across the body divided by the current through it) or by its conductance (the current divided by the voltage). Resistances combine by the series sum for bodies connected in series and by the parallel sum for bodies connected in parallel; for conductances, the roles of the two operations are exchanged. David Ellerman argues that the parallel sum is "just as good" as the series sum and offers a number of illustrations of this parity.

Instead of real positive semi-definite matrices, it is just as simple to consider Hermitian positive semi-definite matrices.

See also

 * Duality (mathematics)
 * Harmonic mean
 * Gaussian equation for thin lenses
 * Geometric construction
 * Richard Duffin
 * Moore–Penrose generalized inverse