User:Andy Holland2/sandbox

Introduction
The Functional Abacus is a mathematical system in which quantized tally functions inter-relate movable coefficients of probability distribution or density distribution functions. The system was inspired by the Psalm of Moses (Psalm 90) and its liturgical notions of time, which enable a quantized, finite, deterministic mathematics in continuum.

The Functional Abacus was invented as a mathematical basis for particle-transport algorithms that obviates the need for the Monte Carlo method, particularly in safety-critical nuclear applications. As with Monte Carlo, the Functional Abacus can be used to model information, quantized and classical particles, currency, optimization objective functions, forces, fields, sound waves, traffic, and so on.

The Functional Abacus can provide analytic, closed-form, and numerical-method solutions to problems previously handled by Monte Carlo, and it can also work alongside Monte Carlo. While the computer-software methodology is patented, the pure mathematics associated with this methodology is freely available to all and is first being published in Wikipedia in recognition of its contributions to this method.

Abacus Series
The Functional Abacus is based upon a fundamental series called the Abacus Series. Functions are described within finite regions such as $$[x] \overset{\underset{\mathrm{def}}{}}{=} x\rightarrow x+\Delta x$$ where $$\Delta x\ne 0$$. Unlike the Calculus, with its foundational Taylor Series, the inter-relationship of functions is handled with independent tally functions outside of regions such as $$[x]$$. Tally functions can be deterministic over a continuous field $$x_c$$, whereas distribution functions operate within defined quantized sub-domains $$[x]$$, which constitute the observable finite domain. Simple quantized wave forms provide a common reference between the continuity variable and the observable, and they form the referential basis for the tally functions and for the Abacus Series that expresses a probability density.

A density function is described, for example in one dimension, with a wave form series such as:

$$n(x)|_{[x]}=a_0[x]+\sum_{i=1}^n a_i[x]w_i(x_c,x)$$

The canonical polynomial Abacus Series is:

$$n(x)|_{[x]}=a_0[x]+\sum_{i=1}^n [a_i[x]\prod_{j=1}^m (x_c-r_{i,j})]$$

where the roots $$r_{i,j}$$ are those of the Legendre Polynomials remapped over the finite interval $$[x]$$. This yields, for the canonical polynomial one dimensional version:

$$n(x)|_{[x]}=a_0[x]+a_1[x](x_c-x-\frac{\Delta x}{2})+a_2[x](x_c-x-\frac{\Delta x}{2}(1+\sqrt{\frac{1}{3}}))(x_c-x-\frac{\Delta x}{2}(1-\sqrt{\frac{1}{3}}))+... $$

The coefficients can be transformed term-wise for each wave form $$w_i(x_c,x)$$ from a regular function $$f(x_c)$$ using:

$$a_i[x]=\frac{\int_{[x]}f(x_c)w_i(x_c,x)dx_c}{\int_{[x]}w_i^2(x_c,x)dx_c}$$

It is easily verified that the Abacus Series converges to the Taylor Series, with the associated factorial constants, in the limit as $$\Delta x \rightarrow 0$$, where the interval $$[x]$$ collapses to the point $$(x_c)$$. Multivariate forms and functions follow naturally. For example, one can have distribution coefficients of many variables, such as $$a_{i,j,k,g,m}[x_i,y_j,z_k,E_g,\Omega_m]$$, representing the three spatial dimensions, energy bins, and solid-angle bins.
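The projection and reconstruction above can be sketched numerically. In the following Python sketch the names (`waveforms`, `abacus_coeffs`) and the trapezoid quadrature are illustrative assumptions, not part of the method's published form; it builds the canonical polynomial wave forms from remapped Legendre roots, projects a function onto them with the coefficient formula, and reconstructs it over $$[x]$$:

```python
import numpy as np
from numpy.polynomial import legendre
from numpy.polynomial.polynomial import Polynomial

def trap(y, xs):
    # simple trapezoid quadrature over sample points xs
    return float(np.sum((y[:-1] + y[1:]) * np.diff(xs)) / 2)

def waveforms(x, dx, n):
    # w_0 = 1; w_i = monic polynomial whose roots are the Legendre
    # roots remapped from [-1, 1] onto [x, x + dx]
    ws = [Polynomial([1.0])]
    for i in range(1, n + 1):
        roots = x + dx / 2 * (legendre.legroots([0] * i + [1]) + 1.0)
        ws.append(Polynomial.fromroots(roots))
    return ws

def abacus_coeffs(f, x, dx, n, samples=401):
    # a_i[x] = (int f w_i dx_c) / (int w_i^2 dx_c) over [x, x + dx]
    xc = np.linspace(x, x + dx, samples)
    return [trap(f(xc) * w(xc), xc) / trap(w(xc) ** 2, xc)
            for w in waveforms(x, dx, n)]

# Reconstruct f(x_c) = x_c^2 over [1, 1.5] from its first three coefficients
f = lambda xc: xc ** 2
x, dx = 1.0, 0.5
a = abacus_coeffs(f, x, dx, 2)
xc = np.linspace(x, x + dx, 5)
recon = sum(ai * w(xc) for ai, w in zip(a, waveforms(x, dx, 2)))
```

Because the monic wave forms are proportional to shifted Legendre polynomials, they are orthogonal over $$[x]$$, so the term-wise projection reproduces any polynomial of matching degree exactly, up to quadrature error.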

Tally Functions
With this definition of the abacus tally coefficients $$a_i[x]$$ relating to quantized wave forms $$w_i(x_c,x)$$, we use tally functions that can be determined exactly with respect to an underlying continuum.

For example, we express this continuum in terms of a pass-through probability $$m(x_c)$$ in one dimension. In a vacuum $$m(x_c)=1$$, while for X-ray particle transport $$m(x_c)=e^{-\frac{\mu x_c}{\rho}} $$. With this notion, consider a simple tally function along a ray from $$[x_1]$$ to the point $$(x_2)$$:

$$T_i[x_1\rightarrow x_2) = \int_{[x_1]}m(x_c)w_i(x_c,x_1)dx_c \cdot \frac{\int_{x_1+\Delta x_1}^{x_2} m(x_c)dx_c}{x_2-(x_1+\Delta x_1)}$$

The full tally function that transmits Abacus Series coefficients from one finite region and wave form to another is:

$$T_{i\rightarrow j}[x_1\rightarrow x_2] = T_i[x_1\rightarrow x_2) \cdot \frac{\int_{[x_2]}(1-m(x_c))w_j(x_c,x_2)dx_c}{\int_{[x_2]}w_j^2(x_c,x_2)dx_c}$$
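The two tally definitions can be sketched numerically. The following Python assumes a simple exponential attenuation field $$m(x_c)=e^{-\mu x_c}$$ with $$\mu=2$$ and trapezoid quadrature; all names are illustrative, not part of the method's published form:

```python
import numpy as np
from numpy.polynomial import legendre

def integ(g, x0, x1, samples=801):
    # trapezoid quadrature of g over [x0, x1]
    xs = np.linspace(x0, x1, samples)
    ys = g(xs)
    return float(np.sum((ys[:-1] + ys[1:]) * np.diff(xs)) / 2)

def waveform(i, x, dx):
    # canonical polynomial wave form w_i(x_c, x) on [x, x + dx]
    if i == 0:
        return lambda xc: np.ones_like(xc)
    roots = x + dx / 2 * (legendre.legroots([0] * i + [1]) + 1.0)
    return lambda xc: np.prod([xc - r for r in roots], axis=0)

def ray_tally(i, x1, dx1, x2, m):
    # T_i[x1 -> x2): source-side moment of m times the mean
    # pass-through probability across the gap (x1 + dx1, x2)
    w = waveform(i, x1, dx1)
    src = integ(lambda xc: m(xc) * w(xc), x1, x1 + dx1)
    gap = integ(m, x1 + dx1, x2) / (x2 - (x1 + dx1))
    return src * gap

def full_tally(i, j, x1, dx1, x2, dx2, m):
    # T_{i->j}[x1 -> x2]: deposit onto destination wave form w_j,
    # weighted by the interaction probability 1 - m(x_c) in [x2]
    wj = waveform(j, x2, dx2)
    num = integ(lambda xc: (1 - m(xc)) * wj(xc), x2, x2 + dx2)
    den = integ(lambda xc: wj(xc) ** 2, x2, x2 + dx2)
    return ray_tally(i, x1, dx1, x2, m) * num / den

m_atten = lambda xc: np.exp(-2.0 * xc)        # assumed attenuation field
t = full_tally(0, 0, 0.0, 0.1, 0.5, 0.1, m_atten)
```

In a vacuum ($$m=1$$) the full tally vanishes, since the interaction weight $$1-m(x_c)$$ in the destination region is zero: nothing is deposited where nothing interacts.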

Coefficient Movement
 'Teach us to number our days aright, that we may gain a heart of wisdom' (Psalm 90:12)

The tally function operates in a finite step wherein one can collect all like events in a sweep method. The finite time epoch for a logical step in the use of a tally function is $$\epsilon_l$$, and from one step to the next we go to state $$\epsilon_{l+1}$$. In simple particle transport, each step represents the number of times grouped like particles have interacted: 0, 1, 2, 3, ..., n times. The movement of distributions over finite regions $$r$$ in such a gathered logical time step is accomplished using:

$$a_j[x,\epsilon_{l+1}] \leftarrow \sum_r\sum_i a_i[x_r,\epsilon_l]T_{i\rightarrow j}[x_r\rightarrow x]$$
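The gathered sweep can be sketched directly. In this Python illustration the tally is a stand-in toy (a diagonal map that transmits half of each coefficient), not a computed transport tally:

```python
import numpy as np

def sweep(a_src, tally, n):
    # one gathered logical step eps_l -> eps_{l+1}:
    # a_j[x, eps_{l+1}] <- sum_r sum_i a_i[x_r, eps_l] * T_{i->j}[x_r -> x]
    a_next = np.zeros(n)
    for r, a in a_src.items():
        for i, ai in enumerate(a):
            for j in range(n):
                a_next[j] += ai * tally(i, j, r)
    return a_next

# Two source regions, two wave forms each; the toy tally transmits
# half of each coefficient to the matching destination wave form.
a_src = {0: np.array([1.0, 0.2]), 1: np.array([0.5, 0.0])}
toy_tally = lambda i, j, r: 0.5 if i == j else 0.0
a_next = sweep(a_src, toy_tally, 2)
```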

Numerical Methods
A sophisticated network is used in multivariate systems to perform particle transport in three dimensions with energy and angle dependence. For approximations of tally functions, and for truncation purposes in practical calculations, one can select interval bins intelligently and use four coefficients for a near-cubic-spline continuity approximation of the coefficient functions. With sixteen multiplies and adds, one moves the coefficients representing one near-continuity function to another state in a step, with a sweep through all initial regions. This effectively removes the need for a Monte Carlo random-number statistical approach, even for the most complex anisotropic-scattering particle-transport simulations.
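The "sixteen multiplies and adds" correspond to a 4×4 matrix-vector product per source region. A minimal sketch, with an assumed toy tally matrix rather than values computed from the tally integrals:

```python
import numpy as np

# Assumed toy 4x4 tally matrix; entry [i, j] stands in for
# T_{i->j}[x_r -> x] for a single source/destination pair.
T = np.array([[0.6, 0.1, 0.0, 0.0],
              [0.0, 0.5, 0.1, 0.0],
              [0.0, 0.0, 0.4, 0.1],
              [0.0, 0.0, 0.0, 0.3]])

a = np.array([1.0, 0.4, 0.1, 0.02])   # four spline-like coefficients a_i[x_r]

# Moving the four coefficients is one 4x4 matrix-vector product per
# source region: sixteen multiplies and sixteen adds.
a_next = T.T @ a                       # a_j <- sum_i a_i T[i, j]
```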

As $$m(x_c)$$ can represent any probability field in continuity with respect to the inter-relationship of functions, the Abacus can provide, even in approximate numerical methods, near deterministic results in near continuity. Tally functions naturally handle discontinuities as well.

The Delta Function
Given an infinite potential wall with which a particle interacts, we have for the $$x_2$$ region $$m(x_c)|_{[x_2]}=0$$. From any initial position, without intervening interaction, any individual particle will be distributed at the point of interaction in accordance with:

$$a_i[x_2]\delta^i(x_2)\leftarrow a_i[x_r]T_i[x_r\rightarrow x_2)$$

Here the infinity of Parseval's identity has been replaced with an infinite wall interaction potential, forcing interaction at a point. The delta function is 'proved' in tally terms in the limit as one goes to zero rather than to infinity, so that $$[x_2]\rightarrow (x_2)$$.

Finite Determinism
Finite determinism is remarkably simple to achieve by employing Functional Abacus thinking.

If the inter-relationship of functions is quantized, and all sub-domains of the variables are quantized, then one can have a deterministic probability mathematical system by simply moving coefficients associated with quantized wave forms and tally functions - as with an Abacus.

In the Calculus, this would seem to violate the Uncertainty Principle. However, in the Calculus one uses a relative-rate system that inter-relates functions through vectors on a point-to-point basis. Points are infinite in number, and in a finite universe they are referential except with respect to field continuity potential. If one has a finite quantized system of wave forms and tally functions, one can achieve or closely approximate continuity without statistics; one can always compute probabilities without having to deal with statistics. Rolling dice may in fact be deterministic with respect to initial conditions, though in our current limited mathematical context it seems "statistical".

The Calculus is a product of reductionism in science. While valuable and beautiful, it is fundamentally limited in its ability to describe creation. It needs a finite deterministic partner, along with a statistical notion to complete a physical modeling of Creation.

Finite determinism in creation follows Classical Christian theology. Creation is finite within an eternally begotten logos - infinite yet begotten, not made and one in essence with the Creator and co-eternal. One with the very Holy Spirit of Truth. The material universe is vast, yet finite, and we are called to reduce ourselves to nothing to enter infinity while seeing all around us as irreducible and of infinite worth.

"My God does not play with dice"
The universe is finite in time, and therefore in space. It follows that while fields may be infinitesimal, constituent matter and waves are fundamentally finite. After all, $$0=\frac{1}{\infty}$$. If there is no big infinity, why should Creation admit a small one in the material sense, with regard to particles, which all have a de Broglie wavelength?

One can compose pre-convoluted quantized tally functions of the form, e.g., $$T_{i,j,k,g,m,t}[x_i,y_j,z_k,E_g,\Omega_m]$$

The wave forms associated with these can have pre-convoluted complex terms of the form, e.g., $$w_{i,j,k,g,m}(x_c,x,y_c,y,z_c,z,E_c,E,\Omega_c,\Omega)$$

A transmitted coefficient, including the first (average) functional coefficient, always transmits values and information, in a mathematical sense, as a particle, even though it may be associated with a wave. In the tally perspective, particles may be waves, but as they interact in finite space they must interact, in tally mathematics, as particles: they have a definite "punch" in terms of the first average value. This is related to the photoelectric effect.

As such, in an Abacus system one has finite determinism, and it appears the Uncertainty Principle is fundamentally circumvented. If one formulates a complex tally function, the cross-product inter-relationship guarantees determinism in the extreme. It appears Einstein's outlook is again proved correct, and that his projection functions, albeit eigenvalue-time based, were on a correct path toward his unified theory of gravity and quantum mechanics.