Elias omega coding

Elias ω coding or Elias omega coding is a universal code encoding the positive integers developed by Peter Elias. Like Elias gamma coding and Elias delta coding, it works by prefixing the positive integer with a representation of its order of magnitude in a universal code. Unlike those other two codes, however, Elias omega recursively encodes that prefix; thus, it is sometimes known as a recursive Elias code.

Omega coding is used in applications where the largest encoded value is not known ahead of time, or to compress data in which small values are much more frequent than large values.

To encode a positive integer N:
 * 1) Place a "0" at the end of the code.
 * 2) If N = 1, stop; encoding is complete.
 * 3) Prepend the binary representation of N to the beginning of the code. This will be at least two bits, the first bit of which is a 1.
 * 4) Let N equal the number of bits just prepended, minus one.
 * 5) Return to Step 2 to prepend the encoding of the new N.
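The steps above can be sketched directly; a minimal Python version (the function name is illustrative):

```python
def elias_omega_encode(n: int) -> str:
    """Sketch of Elias omega encoding for a positive integer n."""
    if n < 1:
        raise ValueError("Elias omega coding is defined for positive integers only")
    code = "0"                      # step 1: trailing 0
    while n > 1:                    # step 2: stop when N = 1
        group = bin(n)[2:]          # step 3: binary of N; its first bit is 1
        code = group + code
        n = len(group) - 1          # step 4: new N = bits just prepended, minus one
    return code
```

For example, `elias_omega_encode(17)` yields `10100100010`: the groups `10`, `100`, and `10001`, followed by the terminating `0`.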

To decode an Elias omega-encoded positive integer:
 * 1) Start with a variable N, set to a value of 1.
 * 2) If the next bit is a "0" then stop. The decoded number is N.
 * 3) If the next bit is a "1" then read it plus N more bits, and use that binary number as the new value of N. Go back to Step 2.
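A matching decoder can be sketched the same way (again, the name is illustrative; the input is a string of bits containing one code at the front):

```python
def elias_omega_decode(bits: str) -> int:
    """Sketch of Elias omega decoding; reads one code from the front of bits."""
    n = 1                            # step 1: start with N = 1
    i = 0
    while bits[i] == "1":            # step 3: a group starts with a 1 bit
        group = bits[i : i + n + 1]  # read it plus N more bits
        n = int(group, 2)            # that binary number is the new N
        i += len(group)
    return n                         # step 2: a 0 bit terminates the code
```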

Examples
Omega codes can be thought of as a number of "groups". A group is either a single 0 bit, which terminates the code, or two or more bits beginning with 1, which is followed by another group.

The first few codes are shown below. Included is the so-called implied distribution, describing the distribution of values for which this coding yields a minimum-size code; see Relationship of universal codes to practical compression for details.

 value  code        implied probability
  1     0           1/2
  2     10 0        1/8
  3     11 0        1/8
  4     10 100 0    1/64
  5     10 101 0    1/64
  6     10 110 0    1/64
  7     10 111 0    1/64
  8     11 1000 0   1/128
  9     11 1001 0   1/128
 10     11 1010 0   1/128

The encoding for 1 googol, $10^{100}$, is 11 1000 101001100 (a 15-bit length header) followed by the 333-bit binary representation of 1 googol, which is 10010 01001001 10101101 00100101 10010100 11000011 01111100 11101011 00001011 00100111 10000100 11000100 11001110 00001011 11110011 10001010 11001110 01000000 10001110 00100001 00011010 01111100 10101010 10110010 01000011 00001000 10101000 00101110 10001111 00010000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 and a trailing 0, for a total of 349 bits.
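These figures are easy to verify with arbitrary-precision integers; a small Python check:

```python
googol = 10 ** 100
value = bin(googol)[2:]            # binary representation of 10^100
assert len(value) == 333

# Rebuild the length header: each group's value is
# (bit length of the following group) - 1.
groups = []
n = len(value) - 1                 # 332
while n > 1:
    g = bin(n)[2:]
    groups.append(g)
    n = len(g) - 1
groups.reverse()
print(groups)                                   # ['11', '1000', '101001100']
print(sum(map(len, groups)) + len(value) + 1)   # 349 bits in total
```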

A googol to the hundredth power ($10^{10\,000}$) is a 33,220-bit binary number. Its omega encoding is 33,243 bits long: 11 1111 1000000111000011 (22 bits), followed by the 33,220 bits of the value and a trailing 0. Under Elias delta coding, the same number is 33,250 bits long: 000000000000000 1000000111000100 (31 bits) followed by the remaining 33,219 bits of the value. The omega and delta codes are, respectively, 0.07% and 0.09% longer than the ordinary 33,220-bit binary representation of the number.
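Both code lengths can be computed without materializing the codes; a Python sketch whose formulas follow the constructions above (function names are illustrative):

```python
def omega_len(n: int) -> int:
    """Length in bits of the Elias omega code for positive n."""
    length = 1                      # trailing 0
    while n > 1:
        b = n.bit_length()          # bits in the group for n
        length += b
        n = b - 1
    return length

def delta_len(n: int) -> int:
    """Length in bits of the Elias delta code for positive n."""
    l = n.bit_length()              # number of bits in n
    # gamma code of l takes 2*floor(log2 l) + 1 bits, then l - 1 value bits
    return 2 * (l.bit_length() - 1) + l

big = 10 ** 10_000
print(omega_len(big), delta_len(big), big.bit_length())
# 33243 33250 33220
```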

Code length
For the encoding of a positive integer $N$, the number of bits needed, $B(N)$, is given recursively by
$$ \begin{align}B(0) & = 0\,, \\ B(N) & = 1 + \lfloor \log_2 N \rfloor + B(\lfloor \log_2 N \rfloor)\,. \end{align}$$
That is, the length of the Elias omega code for the integer $N$ is
$$1 + (1 + \lfloor \log_2 N \rfloor) + (1 + \lfloor \log_2 \lfloor \log_2 N \rfloor \rfloor) + \cdots$$
where the number of terms in the sum is bounded above by the binary iterated logarithm. To be precise, let $f(x) = \lfloor \log_2 x \rfloor$. Then $N > f(N) > f(f(N)) > \cdots > f^k(N) = 1$ for some $k$, and the length of the code is $(k+1) + f(N) + f^2(N) + \cdots + f^k(N)$. Since $f(x) \leq \log_2 x$, we have $k \leq \log_2^*(N)$.
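The recursion can be transcribed directly; a Python sketch, using `bit_length()`, which gives $1 + \lfloor \log_2 N \rfloor$ exactly for positive integers:

```python
def B(n: int) -> int:
    """Bits in the Elias omega code for n, per the recursion above."""
    if n == 0:
        return 0                    # base case: B(0) = 0
    log2n = n.bit_length() - 1      # floor(log2 n), exact for integers
    return 1 + log2n + B(log2n)

print([B(n) for n in range(1, 9)])  # [1, 3, 3, 6, 6, 6, 6, 7]
```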

Since the iterated logarithm $\log_2^* N$ grows more slowly than any fixed iterate $\log_2^n N$, the asymptotic code length is $\Theta(\log_2 N + \log_2 \log_2 N + \cdots)$, where the sum terminates at the first term that drops below one.

Asymptotic optimality
Elias omega coding is an asymptotically optimal prefix code.

Proof sketch. A prefix code must satisfy the Kraft inequality. Applied to the Elias omega code lengths, the Kraft inequality reads
$$\sum_{n=1}^\infty \frac{1}{2^{O(1) + \log_2 n + \log_2\log_2 n + \cdots}} \leq 1\,. $$
The summation is asymptotically the same as an integral, giving
$$\int_1^\infty \frac{dx}{x \ln x \ln \ln x \cdots} \leq O(1)\,. $$
If the denominator terminates at some iterate $\ln^k x$, then the integral diverges like $\ln^{k+1} \infty$; however, if it instead terminates at $(\ln^k x)^{1+\epsilon}$, then the integral converges like $\frac{1}{(\ln^{k} \infty)^\epsilon}$. The Elias omega code sits exactly on the boundary between divergence and convergence.

Generalizations
Elias omega coding does not encode zero or negative integers. One way to encode all non-negative integers is to add 1 before encoding and then subtract 1 after decoding, or use the very similar Levenshtein coding. One way to encode all integers is to set up a bijection, mapping all integers (0, 1, -1, 2, -2, 3, -3, ...) to strictly positive integers (1, 2, 3, 4, 5, 6, 7, ...) before encoding.