Cryptanalysis of the Lorenz cipher

Cryptanalysis of the Lorenz cipher was the process that enabled the British to read high-level German army messages during World War II. The British Government Code and Cypher School (GC&CS) at Bletchley Park decrypted many communications between the Oberkommando der Wehrmacht (OKW, German High Command) in Berlin and their army commands throughout occupied Europe, some of which were signed "Adolf Hitler, Führer". These were intercepted non-Morse radio transmissions that had been enciphered by the Lorenz SZ teleprinter rotor stream cipher attachments. Decrypts of this traffic became an important source of "Ultra" intelligence, which contributed significantly to Allied victory.

For its high-level secret messages, the German armed services enciphered each character using various online Geheimschreiber (secret writer) stream cipher machines at both ends of a telegraph link using the 5-bit International Telegraphy Alphabet No. 2 (ITA2). These machines were subsequently discovered to be the Lorenz SZ (SZ for Schlüssel-Zusatz, meaning "cipher attachment") for the army, the Siemens and Halske T52 for the air force and the Siemens T43, which was little used and never broken by the Allies.

Bletchley Park decrypts of messages enciphered with the Enigma machines revealed that the Germans called one of their wireless teleprinter transmission systems "Sägefisch" (sawfish), which led British cryptographers to refer to encrypted German radiotelegraphic traffic as "Fish". "Tunny" (tunafish) was the name given to the first non-Morse link, and it was subsequently used for the cipher machines and their traffic.

As with the entirely separate cryptanalysis of the Enigma, it was German operational shortcomings that allowed the initial diagnosis of the system, and a way into decryption. Unlike Enigma, no physical machine reached allied hands until the very end of the war in Europe, long after wholesale decryption had been established. The problems of decrypting Tunny messages led to the development of "Colossus", the world's first electronic, programmable digital computer, ten of which were in use by the end of the war, by which time some 90% of selected Tunny messages were being decrypted at Bletchley Park.

Albert W. Small, a cryptanalyst from the US Army Signal Corps who was seconded to Bletchley Park and worked on Tunny, said in his December 1944 report back to Arlington Hall that: "Daily solutions of Fish messages at GC&CS reflect a background of British mathematical genius, superb engineering ability, and solid common sense. Each of these has been a necessary factor. Each could have been overemphasised or underemphasised to the detriment of the solutions; a remarkable fact is that the fusion of the elements has been apparently in perfect proportion. The result is an outstanding contribution to cryptanalytic science."

German Tunny machines


The Lorenz SZ cipher attachments implemented a Vernam stream cipher, using a complex array of twelve wheels that delivered what should have been a cryptographically secure pseudorandom number as a key stream. The key stream was combined with the plaintext to produce the ciphertext at the transmitting end using the exclusive or (XOR) function. At the receiving end, an identically configured machine produced the same key stream which was combined with the ciphertext to produce the plaintext, i.e. the system implemented a symmetric-key algorithm.

The key stream was generated by ten of the twelve wheels. It was the product of XOR-ing the 5-bit character generated by the right-hand five wheels, the chi (χ) wheels, and the left-hand five, the psi (ψ) wheels. The chi wheels always moved on one position for every incoming ciphertext character, but the psi wheels did not.

The central two mu (μ) or "motor" wheels determined whether or not the psi wheels rotated with a new character. After each letter was enciphered either all five psi wheels moved on, or they remained still and the same letter of psi-key was used again. Like the chi wheels, the μ61 wheel moved on after each character. When μ61 had its cam in the active position and so generated x (before moving), μ37 moved on once; when the cam was in the inactive position (before moving), μ37 and the psi wheels stayed still. On all but the earliest machines, an additional factor played into whether or not the psi wheels moved on. These factors were of four different types and were called "limitations" at Bletchley Park. All involved some aspect of the previous positions of the machine's wheels.
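This stepping arrangement can be sketched as a small simulation. The sketch below follows the usual description of the basic motor (the psi wheels advance only when the current μ37 cam is active, and μ37 advances only when the current μ61 cam is active), uses random stand-in cam patterns rather than real SZ42 settings, and ignores the later "limitations":

```python
import random

random.seed(0)

# Random stand-in cam patterns (1 = active/raised, 0 = inactive/lowered);
# real SZ42 machines had fixed, periodically changed patterns.
mu61 = [random.randint(0, 1) for _ in range(61)]
mu37 = [random.randint(0, 1) for _ in range(37)]

pos61 = pos37 = psi_pos = 0
psi_moves = 0
STEPS = 1000
for _ in range(STEPS):
    # All five psi wheels advance together, but only when the
    # current mu37 cam is active.
    if mu37[pos37]:
        psi_pos += 1
        psi_moves += 1
    # mu37 advances only when the current mu61 cam is active.
    if mu61[pos61]:
        pos37 = (pos37 + 1) % 37
    # mu61, like the chi wheels, advances for every character.
    pos61 = (pos61 + 1) % 61

print(psi_moves, "psi movements in", STEPS, "characters")
```

With roughly half the cams active, the psi wheels stand still for a substantial fraction of the characters, which is exactly the irregularity that both strengthened and, as described below, ultimately weakened the cipher.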

The numbers of cams on the set of twelve wheels of the SZ42 machines totalled 501 and were co-prime with each other, giving an extremely long period before the key sequence repeated. Each cam could either be in a raised position, in which case it contributed x to the logic of the system, reversing the value of a bit, or in the lowered position, in which case it generated •. The total possible number of patterns of raised cams was 2^501, which is an astronomically large number. In practice, however, about half of the cams on each wheel were in the raised position. Later, the Germans realized that if the number of raised cams was not very close to 50% there would be runs of xs and •s, a cryptographic weakness.
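These figures are easy to check. The cam counts below are the twelve wheel sizes quoted later in the article (in the discussion of start positions); the code is purely an illustrative verification:

```python
from math import gcd

# Cam counts of the twelve SZ42 wheels: five chi (43..59),
# two motor (37, 61) and five psi (41..23).
wheels = [43, 47, 51, 53, 59, 37, 61, 41, 31, 29, 26, 23]

# 501 cams in total.
assert sum(wheels) == 501

# Pairwise co-prime, so the combined wheel pattern only repeats after
# the product of all twelve lengths.
assert all(gcd(a, b) == 1
           for i, a in enumerate(wheels) for b in wheels[i + 1:])

# Number of possible cam patterns: 2**501, a 151-digit number.
print(len(str(2 ** 501)), "digits")
```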

The process of working out which of the 501 cams were in the raised position was called "wheel breaking" at Bletchley Park. Deriving the start positions of the wheels for a particular transmission was termed "wheel setting" or simply "setting". The fact that the psi wheels all moved together, but not with every input character, was a major weakness of the machines that contributed to British cryptanalytical success.

Secure telegraphy
Electro-mechanical telegraphy was developed in the 1830s and 1840s, well before telephony, and operated worldwide by the time of the Second World War. An extensive system of cables linked sites within and between countries, with a standard voltage of −80 V indicating a "mark" and +80 V indicating a "space". Where cable transmission became impracticable or inconvenient, such as for mobile German Army Units, radio transmission was used.

Teleprinters at each end of the circuit consisted of a keyboard and a printing mechanism, and very often a five-hole perforated paper-tape reading and punching mechanism. When used online, pressing an alphabet key at the transmitting end caused the relevant character to print at the receiving end. Commonly, however, the communication system involved the transmitting operator preparing a set of messages offline by punching them onto paper tape, and then going online only for the transmission of the messages recorded on the tape. The system would typically send some ten characters per second, and so occupy the line or the radio channel for a shorter period of time than for online typing.

The characters of the message were represented by the codes of the International Telegraphy Alphabet No. 2 (ITA2). The transmission medium, either wire or radio, used asynchronous serial communication with each character signaled by a start (space) impulse, 5 data impulses and 1½ stop (mark) impulses. At Bletchley Park mark impulses were signified by x ("cross") and space impulses by • ("dot"). For example, the letter "H" would be coded as ••x•x.

The figure shift (FIGS) and letter shift (LETRS) characters determined how the receiving end interpreted the string of characters up to the next shift character. Because of the danger of a shift character being corrupted, some operators would type a pair of shift characters when changing from letters to numbers or vice versa. So they would type 55M88 to represent a full stop. Such doubling of characters was very helpful for the statistical cryptanalysis used at Bletchley Park. After encipherment, shift characters had no special meaning.

The speed of transmission of a radio-telegraph message was three or four times that of Morse code and a human listener could not interpret it. A standard teleprinter, however, would produce the text of the message. The Lorenz cipher attachment changed the plaintext of the message into ciphertext that was uninterpretable to those without an identical machine identically set up. This was the challenge faced by the Bletchley Park codebreakers.

Interception
Intercepting Tunny transmissions presented substantial problems. As the transmitters were directional, most of the signals were quite weak at receivers in Britain. Furthermore, there were some 25 different frequencies used for these transmissions, and the frequency would sometimes be changed part way through. After the initial discovery of the non-Morse signals in 1940, a radio intercept station called the Foreign Office Research and Development Establishment was set up on a hill at Ivy Farm at Knockholt in Kent, specifically to intercept this traffic. The centre was headed by Harold Kenworthy, had 30 receiving sets and employed some 600 staff. It became fully operational early in 1943. Because a single missed or corrupted character could make decryption impossible, the greatest accuracy was required. The undulator technology used to record the impulses had originally been developed for high-speed Morse. It produced a visible record of the impulses on narrow paper tape. This was then read by people employed as "slip readers" who interpreted the peaks and troughs as the marks and spaces of ITA2 characters. Perforated paper tape was then produced for telegraphic transmission to Bletchley Park, where it was punched out.

The Vernam cipher
The Vernam cipher implemented by the Lorenz SZ machines utilizes the Boolean "exclusive or" (XOR) function, symbolised by ⊕ and verbalised as "A or B but not both". This is represented by the following truth table, where x represents "true" and • represents "false":

 * • ⊕ • = •
 * • ⊕ x = x
 * x ⊕ • = x
 * x ⊕ x = •

Other names for this function are: exclusive disjunction, not equal (NEQ), and modulo 2 addition (without "carry") and subtraction (without "borrow"). Modulo 2 addition and subtraction are identical. Some descriptions of Tunny decryption refer to addition and some to differencing, i.e. subtraction, but they mean the same thing. The XOR operator is both associative and commutative.

Reciprocity is a desirable feature of a machine cipher so that the same machine with the same settings can be used either for enciphering or for deciphering. The Vernam cipher achieves this, as combining the stream of plaintext characters with the key stream produces the ciphertext, and combining the same key with the ciphertext regenerates the plaintext.

Symbolically:


 * Plaintext ⊕ Key = Ciphertext

and


 * Ciphertext ⊕ Key = Plaintext

Vernam's original idea was to use conventional telegraphy practice, with a paper tape of the plaintext combined with a paper tape of the key at the transmitting end, and an identical key tape combined with the ciphertext signal at the receiving end. Each pair of key tapes would have been unique (a one-time tape), but generating and distributing such tapes presented considerable practical difficulties. In the 1920s four men in different countries invented rotor Vernam cipher machines to produce a key stream to act instead of a key tape. The Lorenz SZ40/42 was one of these.
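The reciprocity property can be demonstrated in a few lines. The sketch below uses the Bletchley Park dot/cross notation described above; the plaintext character is "H" (••x•x) as given earlier, and the key character is an arbitrary value chosen for illustration:

```python
def xor(a, b):
    """XOR two 5-impulse characters written in dot/cross notation."""
    return "".join("x" if (i == "x") != (j == "x") else "•"
                   for i, j in zip(a, b))

plain = "••x•x"   # "H" in ITA2, as given in the text above
key   = "x•xx•"   # an arbitrary key character for illustration

cipher = xor(plain, key)
# Reciprocity: combining the same key with the ciphertext
# regenerates the plaintext.
print(cipher, xor(cipher, key))
```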

Security features
A monoalphabetic substitution cipher such as the Caesar cipher can easily be broken, given a reasonable amount of ciphertext. This is achieved by frequency analysis of the different letters of the ciphertext, and comparing the result with the known letter frequency distribution of the plaintext.

With a polyalphabetic cipher, there is a different substitution alphabet for each successive character. So a frequency analysis shows an approximately uniform distribution, such as that obtained from a (pseudo) random number generator. However, because one set of Lorenz wheels turned with every character while the other did not, the machine did not disguise the pattern in the use of adjacent characters in the German plaintext. Alan Turing discovered this weakness and invented the differencing technique described below to exploit it.

The pattern of which cams were in the raised position and which in the lowered position was changed daily on the motor wheels (μ37 and μ61). The chi wheel cam patterns were initially changed monthly. The psi wheel patterns were changed quarterly until October 1942, when the frequency was increased to monthly, and then to daily on 1 August 1944, when the frequency of changing the chi wheel patterns was also made daily.

The number of start positions of the wheels was 43×47×51×53×59×37×61×41×31×29×26×23, which is approximately 1.6×10^19 (16 billion billion), far too large a number for cryptanalysts to try an exhaustive "brute-force attack". Sometimes the Lorenz operators disobeyed instructions and two messages were transmitted with the same start positions, a phenomenon termed a "depth". The method by which the transmitting operator told the receiving operator the wheel settings that he had chosen for the message which he was about to transmit was termed the "indicator" at Bletchley Park.
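The arithmetic behind that figure can be sketched directly (purely illustrative):

```python
from math import prod

# The twelve wheel sizes from the product above.
wheel_sizes = [43, 47, 51, 53, 59, 37, 61, 41, 31, 29, 26, 23]

starts = prod(wheel_sizes)
print(starts)  # approximately 1.6 x 10**19 possible start positions
```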

In August 1942, the formulaic starts to the messages, which were useful to cryptanalysts, were replaced by some irrelevant text, which made identifying the true message somewhat harder. This new material was dubbed quatsch (German for "nonsense") at Bletchley Park.

During the phase of the experimental transmissions, the indicator consisted of twelve German forenames, the initial letters of which indicated the position to which the operators turned the twelve wheels. As well as showing when two transmissions were fully in depth, it also allowed the identification of partial depths where two indicators differed only in one or two wheel positions. From October 1942 the indicator system changed to the sending operator transmitting the unenciphered letters QEP followed by a two digit number. This number was taken serially from a code book that had been issued to both operators and gave, for each QEP number, the settings of the twelve wheels. The books were replaced when they had been used up, but between replacements, complete depths could be identified by the re-use of a QEP number on a particular Tunny link.

Diagnosis
The first step in breaking a new cipher is to diagnose the logic of the processes of encryption and decryption. In the case of a machine cipher such as Tunny, this entailed establishing the logical structure and hence functioning of the machine. This was achieved without the benefit of seeing a machine—which only happened in 1945, shortly before the allied victory in Europe. The enciphering system was very good at ensuring that the ciphertext Z contained no statistical, periodic or linguistic characteristics to distinguish it from random. However this did not apply to K, χ, ψ and ψ′, which was the weakness that meant that Tunny keys could be solved.

During the experimental period of Tunny transmissions when the twelve-letter indicator system was in use, John Tiltman, Bletchley Park's veteran and remarkably gifted cryptanalyst, studied the Tunny ciphertexts and identified that they used a Vernam cipher.

When two transmissions (a and b) use the same key, i.e. they are in depth, combining them eliminates the effect of the key. Let us call the two ciphertexts Za and Zb, the key K and the two plaintexts Pa and Pb. We then have:

 * Za ⊕ Zb = Pa ⊕ Pb

If the two plaintexts can be worked out, the key can be recovered from either ciphertext-plaintext pair, e.g.:

 * Za ⊕ Pa = K or
 * Zb ⊕ Pb = K

On 31 August 1941, two long messages were received that had the same indicator HQIBPEXEZMUG. The first seven characters of these two ciphertexts were the same, but the second message was shorter. The first 15 characters of the two messages were as follows (in Bletchley Park interpretation): John Tiltman tried various likely pieces of plaintext, i.e. "cribs", against the Za ⊕ Zb string and found that the first plaintext message started with the German word SPRUCHNUMMER (message number). In the second plaintext, the operator had used the common abbreviation NR for NUMMER. There were more abbreviations in the second message, and the punctuation sometimes differed. This allowed Tiltman to work out, over ten days, the plaintext of both messages, as a sequence of plaintext characters discovered in Pa could then be tried against Pb and vice versa. In turn, this yielded almost 4000 characters of key.

Members of the Research Section worked on this key to try to derive a mathematical description of the key generating process, but without success. Bill Tutte joined the section in October 1941 and was given the task. He had read chemistry and mathematics at Trinity College, Cambridge before being recruited to Bletchley Park. At his training course, he had been taught the Kasiski examination technique of writing out a key on squared paper, starting a new row after a defined number of characters that was suspected of being the frequency of repetition of the key. If this number was correct, the columns of the matrix would show more repetitions of sequences of characters than chance alone would produce.

Tutte thought that it was possible that, rather than using this technique on the whole letters of the key, which were likely to have a long frequency of repetition, it might be worth trying it on the sequence formed by taking only one impulse (bit) from each letter, on the grounds that "the part might be cryptographically simpler than the whole". Given that the Tunny indicators used 25 letters (excluding J) for 11 of the positions, but only 23 letters for the twelfth, he tried Kasiski's technique on the first impulse of the key characters using a repetition of 25 × 23 = 575. This did not produce a large number of repetitions in the columns, but Tutte did observe the phenomenon on a diagonal. He therefore tried again with 574, which showed up repeats in the columns. Recognising that the prime factors of this number are 2, 7 and 41, he tried again with a period of 41 and "got a rectangle of dots and crosses that was replete with repetitions".
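Tutte's column-repetition test can be illustrated with a toy reconstruction. The stream below is not real key: it is a hidden 41-long wheel pattern with 10% of bits randomised to stand in for the irregular psi contribution, and the agreement measure is a simple stand-in for counting column repetitions:

```python
import random

random.seed(1)

# Toy stand-in for the first impulse of ~2000 key characters.
pattern = [random.randint(0, 1) for _ in range(41)]
stream = [pattern[i % 41] if random.random() > 0.1 else random.randint(0, 1)
          for i in range(2000)]

def column_agreement(stream, width):
    """Write the stream out in rows of `width` characters and measure how
    often a cell agrees with the cell directly below it."""
    rows = [stream[i:i + width]
            for i in range(0, len(stream) - width + 1, width)]
    agree = total = 0
    for r1, r2 in zip(rows, rows[1:]):
        for a, b in zip(r1, r2):
            agree += (a == b)
            total += 1
    return agree / total

# At the true period the columns line up; at a wrong width they do not.
print(column_agreement(stream, 41), column_agreement(stream, 40))
```

Run at the true width the columns agree far more often than chance; one position off, the agreement collapses towards 50%, which is the diagonal effect Tutte noticed when he tried 575 instead of a multiple of 41.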

It was clear, however, that the sequence of first impulses was more complicated than that produced by a single wheel of 41 positions. Tutte called this component of the key χ1 (chi). He figured that there was another component, which was XOR-ed with this, that did not always change with each new character, and that this was the product of a wheel that he called ψ1 (psi). The same applied for each of the five impulses, indicated here by subscripts. So for a single character, the whole key K consisted of two components:

 * K = χ ⊕ ψ′

The actual sequence of characters added by the psi wheels, including those contributed when they did not advance, was referred to as the extended psi, and symbolised by ψ′.

Tutte's derivation of the ψ component was made possible by the fact that dots were more likely than not to be followed by dots, and crosses more likely than not to be followed by crosses. This was a product of a weakness in the German key setting, which they later stopped. Once Tutte had made this breakthrough, the rest of the Research Section joined in to study the other impulses, and it was established that the five ψ wheels all moved together under the control of two μ (mu or "motor") wheels.

Diagnosing the functioning of the Tunny machine in this way was a truly remarkable cryptanalytical achievement, and was described when Tutte was inducted as Officer of the Order of Canada in October 2001, as "one of the greatest intellectual feats of World War II".

Turingery
In July 1942 Alan Turing spent a few weeks in the Research Section. He had become interested in the problem of breaking Tunny from the keys that had been obtained from depths. In July, he developed a method of deriving the cam settings ("wheel breaking") from a length of key. It became known as "Turingery" (playfully dubbed "Turingismus" by Peter Ericsson, Peter Hilton and Donald Michie) and introduced the important method of "differencing" on which much of the rest of solving Tunny keys in the absence of depths, was based.

Differencing
The search was on for a process that would manipulate the ciphertext or key to produce a frequency distribution of characters that departed from the uniformity that the enciphering process aimed to achieve. Turing worked out that the XOR combination of the values of successive (adjacent) characters in a stream of ciphertext or key emphasised any departures from a uniform distribution. The resultant stream was called the difference (symbolised by the Greek letter "delta" Δ) because XOR is the same as modulo 2 subtraction. So, for a stream of characters S, the difference ΔS was obtained as follows, where S+1 indicates the succeeding character:

 * ΔS = S ⊕ S+1
The stream S may be ciphertext Z, plaintext P, key K or either of its two components χ and ψ. The relationship amongst these elements still applies when they are differenced. For example, as well as:

 * K = χ ⊕ ψ′

It is the case that:

 * ΔK = Δχ ⊕ Δψ′

Similarly for the ciphertext, plaintext and key components:

 * Z = P ⊕ K

So:

 * ΔZ = ΔP ⊕ ΔK
The reason that differencing provided a way into Tunny was that although the frequency distribution of characters in the ciphertext could not be distinguished from a random stream, the same was not true for a version of the ciphertext from which the chi element of the key had been removed. This is because, where the plaintext contained a repeated character and the psi wheels did not move on, the differenced psi character (Δψ) would be the null character (' / ' at Bletchley Park). When XOR-ed with any character, this character has no effect, so in these circumstances, ΔK = Δχ. The ciphertext modified by the removal of the chi component of the key was called the de-chi D at Bletchley Park, and the process of removing it was known as "de-chi-ing". Similarly for the removal of the psi component, which was known as "de-psi-ing" (or "deep sighing" when it was particularly difficult).

So the delta de-chi ΔD was:

 * ΔD = ΔP ⊕ Δψ′
Repeated characters in the plaintext were more frequent both because of the characteristics of German (EE, TT, LL and SS are relatively common), and because telegraphists frequently repeated the figures-shift and letters-shift characters as their loss in an ordinary telegraph transmission could lead to gibberish.
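The differencing operation, and the way a repeated character yields the null character, can be sketched as follows (the sample values are arbitrary 5-bit characters, not real traffic):

```python
def delta(stream):
    """XOR each character with its successor: modulo-2 'differencing'."""
    return [a ^ b for a, b in zip(stream, stream[1:])]

# Arbitrary 5-bit characters; the first two are a repeated character.
s = [0b00101, 0b00101, 0b11000, 0b10100]
print(delta(s))  # the repeat differences to 0, the null character '/'
```

Because a repeat always differences to the null character, a stream with many repeated characters has a delta that is biased towards ' / ', and that bias survives XOR with any other stream.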

To quote the General Report on Tunny: "Turingery introduced the principle that the key differenced at one, now called ΔK, could yield information unobtainable from ordinary key. This Δ principle was to be the fundamental basis of nearly all statistical methods of wheel-breaking and setting."

Differencing was applied to each of the impulses of the ITA2 coded characters. So, for the first impulse, that was enciphered by wheels χ1 and ψ1, differenced at one:

 * ΔK1 = Δχ1 ⊕ Δψ′1

And for the second impulse:

 * ΔK2 = Δχ2 ⊕ Δψ′2

And so on.

The periodicity of the chi and psi wheels for each impulse (41 and 43 respectively for the first impulse) is also reflected in the pattern of ΔK. However, given that the psi wheels did not advance for every input character in the way that the chi wheels did, it was not simply a repetition of the pattern every 41 × 43 = 1763 characters for ΔK1, but a more complex sequence.

Turing's method
Turing's method of deriving the cam settings of the wheels from a length of key obtained from a depth involved an iterative process. Given that the delta psi character was the null character ' / ' half of the time on average, an assumption that ΔK = Δχ had a 50% chance of being correct. The process started by treating a particular ΔK character as being the Δχ for that position. The resulting putative bit pattern of x and • for each chi wheel was recorded on a sheet of paper that contained as many columns as there were characters in the key, and five rows representing the five impulses of the Δχ. Given the knowledge from Tutte's work of the periodicity of each of the wheels, this allowed the propagation of these values at the appropriate positions in the rest of the key.

A set of five sheets, one for each of the chi wheels, was also prepared. These contained a set of columns corresponding in number to the cams for the appropriate chi wheel, and were referred to as a 'cage'. So the χ3 cage had 29 such columns. Successive 'guesses' of Δχ values then produced further putative cam state values. These might either agree or disagree with previous assumptions, and a count of agreements and disagreements was made on these sheets. Where disagreements substantially outweighed agreements, the assumption was made that the Δψ character was not the null character ' / ', so the relevant assumption was discounted. Progressively, all the cam settings of the chi wheels were deduced, and from them, the psi and motor wheel cam settings.

As experience of the method developed, improvements were made that allowed it to be used with much shorter lengths of key than the original 500 or so characters.

Testery
The Testery was the section at Bletchley Park that performed the bulk of the work involved in decrypting Tunny messages. By July 1942, the volume of traffic was building up considerably. A new section was therefore set up, led by Ralph Tester—hence the name. The staff consisted mainly of ex-members of the Research Section, and included Peter Ericsson, Peter Hilton, Denis Oswald and Jerry Roberts. The Testery's methods were almost entirely manual, both before and after the introduction of automated methods in the Newmanry to supplement and speed up their work.

The first phase of the work of the Testery ran from July to October, with the predominant method of decryption being based on depths and partial depths. After ten days, however, the formulaic start of the messages was replaced by nonsensical quatsch, making decryption more difficult. This period was productive nonetheless, even though each decryption took considerable time. Finally, in September, a depth was received that allowed Turing's method of wheel breaking, "Turingery", to be used, leading to the ability to start reading current traffic. Extensive data about the statistical characteristics of the language of the messages was compiled, and the collection of cribs extended.

In late October 1942 the original, experimental Tunny link was closed and two new links (Codfish and Octopus) were opened. With these and subsequent links, the 12-letter indicator system of specifying the message key was replaced by the QEP system. This meant that only full depths could be recognised—from identical QEP numbers—which led to a considerable reduction in traffic decrypted.

Once the Newmanry became operational in June 1943, the nature of the work performed in the Testery changed, with decrypts and wheel breaking no longer relying on depths.

British Tunny


The so-called "British Tunny Machine" was a device that exactly replicated the functions of the SZ40/42 machines. It was used to produce the German cleartext from a ciphertext tape, after the cam settings had been determined. The functional design was produced at Bletchley Park, where ten Testery Tunnies were in use by the end of the war. It was designed and built in Tommy Flowers' laboratory at the General Post Office Research Station at Dollis Hill by Gil Hayward, "Doc" Coombs, Bill Chandler and Sid Broadhurst. It was mainly built from standard British telephone exchange electro-mechanical equipment such as relays and uniselectors. Input and output were by means of a teleprinter with paper tape reading and punching. These machines were used in both the Testery and later the Newmanry. Dorothy Du Boisson, who was a machine operator and a member of the Women's Royal Naval Service (a Wren), described plugging up the settings as being like operating an old-fashioned telephone exchange, and said that she received electric shocks in the process.

When Flowers was invited by Hayward to try the first British Tunny machine at Dollis Hill by typing in the standard test phrase: "Now is the time for all good men to come to the aid of the party", he much appreciated that the rotor functions had been set up to provide the following Wordsworthian output: "I WANDERED LONELY AS A CLOUD".

Additional features were added to the British Tunnies to simplify their operation. Further refinements were made for the versions used in the Newmanry, the third Tunny being equipped to produce de-chi tapes.

Newmanry
The Newmanry was a section set up under Max Newman in December 1942 to look into the possibility of assisting the work of the Testery by automating parts of the process of decrypting Tunny messages. Newman had been working with Gerry Morgan, head of the Research Section, on ways of breaking Tunny when Bill Tutte approached them in November 1942 with the idea of what became known as the "1+2 break in". This was recognised as being feasible, but only if automated.

Newman produced a functional specification of what was to become the "Heath Robinson" machine. He recruited the Post Office Research Station at Dollis Hill, and Dr C. E. Wynn-Williams at the Telecommunications Research Establishment (TRE) at Malvern, to implement his idea. Work on the engineering design started in January 1943 and the first machine was delivered in June. The staff at that time consisted of Newman, Donald Michie, Jack Good, two engineers and 16 Wrens. By the end of the war the Newmanry contained three Robinson machines, ten Colossus computers and a number of British Tunnies. The staff then comprised 26 cryptographers, 28 engineers and 275 Wrens.

The automation of these processes required the processing of large quantities of punched paper tape such as those on which the enciphered messages were received. Absolute accuracy of these tapes and their transcription was essential, as a single character in error could invalidate or corrupt a huge amount of work. Jack Good introduced the maxim "If it's not checked it's wrong".

The "1+2 break in"
W. T. Tutte developed a way of exploiting the non-uniformity of bigrams (adjacent letters) in the German plaintext using the differenced ciphertext and key components. His method was called the "1+2 break in", or "double-delta attack". The essence of this method was to find the initial settings of the chi component of the key by exhaustively trying all positions of its combination with the ciphertext, and looking for evidence of the non-uniformity that reflected the characteristics of the original plaintext. The wheel breaking process had to have successfully produced the current cam settings to allow the relevant sequence of characters of the chi wheels to be generated. It was totally impracticable to generate the 22 million characters from all five of the chi wheels, so it was initially limited to 41 × 31 = 1271 from the first two.

Given that for each of the five impulses i:

 * Zi = χi ⊕ ψ′i ⊕ Pi

and hence

 * Zi ⊕ χi = ψ′i ⊕ Pi

for the first two impulses:

 * (Z1 ⊕ Z2) ⊕ (χ1 ⊕ χ2) = (ψ′1 ⊕ P1) ⊕ (ψ′2 ⊕ P2)
Calculating a putative (Z1 ⊕ Z2) ⊕ (χ1 ⊕ χ2) in this way for each starting point of the χ1 ⊕ χ2 sequence would yield xs and •s with, in the long run, a greater proportion of •s when the correct starting point had been used. Tutte knew, however, that using the differenced (Δ) values amplified this effect, because any repeated characters in the plaintext would always generate •, and similarly Δψ1 ⊕ Δψ2 would generate • whenever the psi wheels did not move on, and about half of the time when they did: some 70% overall.

Tutte analyzed a decrypted ciphertext with the differenced version of the above function:

 * ΔZ1 ⊕ ΔZ2 ⊕ Δχ1 ⊕ Δχ2

and found that it generated • some 55% of the time. Given the nature of the contribution of the psi wheels, the alignment of the chi-stream with the ciphertext that gave the highest count of •s from ΔZ1 ⊕ ΔZ2 ⊕ Δχ1 ⊕ Δχ2 was the one that was most likely to be correct. This technique could be applied to any pair of impulses and so provided the basis of an automated approach to obtaining the de-chi (D) of a ciphertext, from which the psi component could be removed by manual methods.
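An end-to-end toy sketch of the 1+2 break in follows. Everything here is made up for illustration: the cam patterns are random, the plaintext model (40% repeated characters) and psi model (wheels advance 70% of the time) are chosen only to reproduce the statistical bias the method exploits, and the figures are not those of real traffic:

```python
import random

random.seed(42)
N = 3000

# Made-up cam patterns: the two fastest chi wheels (41 and 31 cams)
# and two psi wheels (43 and 47 cams).
chi1 = [random.randint(0, 1) for _ in range(41)]
chi2 = [random.randint(0, 1) for _ in range(31)]
psi1 = [random.randint(0, 1) for _ in range(43)]
psi2 = [random.randint(0, 1) for _ in range(47)]

# Toy plaintext impulses: repeat the previous character 40% of the time,
# standing in for repeated letters and doubled shift characters.
p1, p2 = [0], [0]
for _ in range(N - 1):
    if random.random() < 0.4:
        p1.append(p1[-1]); p2.append(p2[-1])
    else:
        p1.append(random.randint(0, 1)); p2.append(random.randint(0, 1))

# Extended psi: the psi wheels advance together only ~70% of the time.
s1, s2, pos1, pos2 = [], [], 0, 0
for _ in range(N):
    s1.append(psi1[pos1]); s2.append(psi2[pos2])
    if random.random() < 0.7:
        pos1 = (pos1 + 1) % 43; pos2 = (pos2 + 1) % 47

# Ciphertext impulses Zi = Pi xor chi_i xor psi'_i; true chi start = (0, 0).
z1 = [p1[i] ^ chi1[i % 41] ^ s1[i] for i in range(N)]
z2 = [p2[i] ^ chi2[i % 31] ^ s2[i] for i in range(N)]

def delta(s):
    return [a ^ b for a, b in zip(s, s[1:])]

dz1, dz2 = delta(z1), delta(z2)
dchi1 = [chi1[j] ^ chi1[(j + 1) % 41] for j in range(41)]
dchi2 = [chi2[j] ^ chi2[(j + 1) % 31] for j in range(31)]

def dots(o1, o2):
    """Count of dots in dZ1 + dZ2 + dchi1 + dchi2 for chi start (o1, o2)."""
    return sum((dz1[i] ^ dz2[i]
                ^ dchi1[(i + o1) % 41] ^ dchi2[(i + o2) % 31]) == 0
               for i in range(N - 1))

# Exhaustively try all 41 x 31 = 1271 chi start points; the highest dot
# count should appear at the true start positions.
best = max((dots(o1, o2), o1, o2)
           for o1 in range(41) for o2 in range(31))
print(best)
```

This is exactly the counting task the Robinson and Colossus machines performed at speed: run every chi alignment against the differenced ciphertext and report the alignments whose dot count exceeds a set total.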

Robinsons
Heath Robinson was the first machine produced to automate Tutte's 1+2 method. It was given the name by the Wrens who operated it, after cartoonist William Heath Robinson, who drew immensely complicated mechanical devices for simple tasks, similar to the American cartoonist Rube Goldberg.

The functional specification of the machine was produced by Max Newman. The main engineering design was the work of Frank Morrell at the Post Office Research Station at Dollis Hill in North London, with his colleague Tommy Flowers designing the "Combining Unit". Dr C. E. Wynn-Williams from the Telecommunications Research Establishment at Malvern produced the high-speed electronic valve and relay counters. Construction started in January 1943, and the prototype machine was in use at Bletchley Park by June.

The main parts of the machine were:
 * a tape transport and reading mechanism (dubbed the "bedstead" because of its resemblance to an upended metal bed frame) that ran the looped key and message tapes at between 1000 and 2000 characters per second;
 * a combining unit that implemented the logic of Tutte's method;
 * a counting unit that counted the number of •s, and if it exceeded a pre-set total, displayed or printed it.

The prototype machine was effective despite a number of serious shortcomings. Most of these were progressively overcome in the development of what became known as "Old Robinson".

Colossus


Tommy Flowers had reservations about Heath Robinson's two synchronised tape loops, and his previous, unique experience of thermionic valves (vacuum tubes) led him to realize that a better machine could be produced using electronics. Instead of the key stream being read from a second punched paper tape, an electronically generated key stream would allow much faster and more flexible processing. Flowers' suggestion that this could be achieved with a machine that was entirely electronic and would contain between one and two thousand valves was treated with incredulity at both the Telecommunications Research Establishment and at Bletchley Park, as it was thought that it would be "too unreliable to do useful work". He did, however, have the support of the Controller of Research at Dollis Hill, W. Gordon Radley, and he implemented these ideas to produce Colossus, the world's first electronic digital computing machine that was at all programmable, in the remarkably short time of ten months. In this he was assisted by his colleagues at the Post Office Research Station, Dollis Hill: Sidney Broadhurst, William Chandler, Allen Coombs and Harry Fensom.

The prototype Mark 1 Colossus (Colossus I), with its 1500 valves, was working at Dollis Hill in December 1943 and became fully operational at Bletchley Park on 5 February 1944. It processed the message at 5000 characters per second, using the impulses from reading the tape's sprocket holes as the clock signal. It quickly became evident that this was a huge leap forward in the cryptanalysis of Tunny. Further Colossus machines were ordered and the orders for more Robinsons were cancelled. The improved Mark 2 Colossus (Colossus II) contained 2400 valves and first worked at Bletchley Park on 1 June 1944, just in time for the D-Day Normandy landings.

The main parts of this machine were:
 * a tape transport and reading mechanism (the "bedstead") that ran the message tape in a loop at 5000 characters per second;
 * a unit that generated the key stream electronically;
 * five parallel processing units that could be programmed to perform a large range of Boolean operations (in the Mark 2 Colossus);
 * five counting units that each counted the number of •s or xs, and if it exceeded a pre-set total, printed it out.

The five parallel processing units allowed Tutte's "1+2 break in" and other functions to be run at an effective speed of 25,000 characters per second by the use of circuitry invented by Flowers that would now be called a shift register. Donald Michie worked out a method of using Colossus to assist in wheel breaking as well as for wheel setting in early 1944. This was then implemented in special hardware on later Colossi.
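One plausible way to picture the fivefold speed-up is the following Python sketch. It is a simplified model, not Flowers' actual circuitry, and `score_five` is a name invented here: a five-stage shift register holds the most recent message characters, so each key character generated can be compared against five alignments at once, accumulating five counts in a single pass over the tape.

```python
from collections import deque

def score_five(message, key):
    """Score five adjacent alignments of key against message in one pass.

    Counter `lag` accumulates matches of key[t] against message[t - lag],
    i.e. the key stream shifted `lag` places relative to the message.
    """
    register = deque(maxlen=5)      # five-stage shift register
    counters = [0] * 5              # one match counter per register tap
    for m, k in zip(message, key):
        register.appendleft(m)      # shift the new message character in
        for lag, held in enumerate(register):
            if held == k:           # tap `lag` sees the message `lag` back
                counters[lag] += 1
    return counters
```

Reading the message once thus yields the scores that would otherwise require five separate passes, which is how 5000 characters per second became an effective 25,000.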

A total of ten Colossus computers were in use and an eleventh was being commissioned at the end of the war in Europe (VE-Day). Of the ten, seven were used for "wheel setting" and three for "wheel breaking".

Special machines
As well as the commercially produced teleprinters and re-perforators, a number of other machines were built to assist in the preparation and checking of tapes in the Newmanry and Testery. The approximate complement as of May 1945 was as follows.

Steps in wheel setting
Working out the start position of the chi (χ) wheels required first that their cam settings had been determined by "wheel breaking". Initially, this was achieved by two messages having been sent in depth.

The number of start positions for the first two wheels, χ1 and χ2, was 41 × 31 = 1271. The first step was to try all of these start positions against the message tape. This was Tutte's "1+2 break in", which involved computing ∆Z1 ⊕ ∆Z2 ⊕ ∆χ1 ⊕ ∆χ2 (which gives a putative ∆P1 ⊕ ∆P2) and counting the number of times this gave •. Incorrect starting positions would, on average, give a dot count of 50% of the message length. On average, the dot count for a correct starting point would be 54%, but there was inevitably a considerable spread of values around these averages.

Both Heath Robinson, which was developed into what became known as "Old Robinson", and Colossus were designed to automate this process. Statistical theory allowed the derivation of measures of how far any count was from the 50% expected with an incorrect starting point for the chi wheels. This measure of deviation from randomness was called sigma. Starting points whose count exceeded the 50% expectation by less than 2.5 × sigma, the threshold named the "set total", were not printed out. The ideal for a run to set χ1 and χ2 was that a single pair of trial values produced one outstanding value for sigma, thus identifying the start positions of the first two chi wheels. An example of the output from such a run on a Mark 2 Colossus with its five counters: a, b, c, d and e, is given below.
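The set-total arithmetic can be sketched as follows. This is a reconstruction of the statistics from the figures in the text (for a wrong setting the dot count is binomial with p = 0.5, so its standard deviation is √N / 2), and the function names are hypothetical; the 2.5 × sigma threshold mentioned above is the default.

```python
import math

def set_total(n_chars, threshold_sigmas=2.5):
    """Smallest dot count worth printing for a message of n_chars."""
    mean = n_chars / 2                      # expectation for a wrong setting
    sigma = math.sqrt(n_chars) / 2          # binomial standard deviation
    return math.ceil(mean + threshold_sigmas * sigma)

def sigmage(count, n_chars):
    """How many sigma a dot count lies above the 50% expectation."""
    return (count - n_chars / 2) / (math.sqrt(n_chars) / 2)
```

For a 10,000-character message, a wrong setting averages 5,000 dots with sigma = 50, so the set total is 5,125; a correct setting averaging 54% dots scores about 8 sigma and stands out clearly.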

With an average-sized message, this would take about eight minutes. However, by utilising the parallelism of the Mark 2 Colossus, the number of times the message had to be read could be reduced by a factor of five, from 1271 to 255. Having identified possible χ1, χ2 start positions, the next step was to try to find the start positions for the other chi wheels. In the example given above, there is a single setting of χ1 = 36 and χ2 = 21 whose sigma value makes it stand out from the rest. This was not always the case, and Small enumerates 36 different further runs that might be tried according to the result of the χ1, χ2 run. At first the choices in this iterative process were made by the cryptanalyst sitting at the typewriter output, and calling out instructions to the Wren operators. Max Newman devised a decision tree and then set Jack Good and Donald Michie the task of devising others. These were used by the Wrens without recourse to the cryptanalysts if certain criteria were met.

In one of Small's examples above, the next run was with the first two chi wheels set to the start positions found, and three separate parallel explorations of the remaining three chi wheels. Such a run was called a "short run" and took about two minutes.

So the probable start positions for the chi wheels are: χ1 = 36, χ2 = 21, χ3 = 01, χ4 = 19, χ5 = 04. These had to be verified before the de-chi (D) message was passed to the Testery. This involved Colossus performing a count of the frequency of the 32 characters in ∆D. Small describes the check of the frequency count of the ∆D characters as being the "acid test", and that practically every cryptanalyst and Wren in the Newmanry and Testery knew the contents of the following table by heart.
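The frequency count behind this "acid test" can be sketched in Python. This is an illustrative reconstruction: `delta_d_histogram` is a name invented here, and the table of expected ∆D frequencies against which the real count was compared is not reproduced in this text.

```python
from collections import Counter

def delta_d_histogram(de_chi):
    """Frequency of the 32 five-bit characters in the differenced de-chi.

    de_chi is a sequence of integers 0..31, one per 5-bit ITA2 character;
    differencing XORs each character with its successor.
    """
    deltas = [a ^ b for a, b in zip(de_chi, de_chi[1:])]
    counts = Counter(deltas)
    return [counts.get(c, 0) for c in range(32)]
```

A de-chi derived from correct chi settings shows the markedly non-uniform profile of ∆D; an incorrect one produces a near-flat histogram.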

If the derived start points of the chi wheels passed this test, the de-chi-ed message was passed to the Testery where manual methods were used to derive the psi and motor settings. As Small remarked, the work in the Newmanry took a great amount of statistical science, whereas that in the Testery took much knowledge of language and was of great interest as an art. Cryptanalyst Jerry Roberts made the point that this Testery work was a greater load on staff than the automated processes in the Newmanry.