Bremermann's limit

Bremermann's limit, named after Hans-Joachim Bremermann, is a limit on the maximum rate of computation that can be achieved in a self-contained system in the material universe. It is derived from Einstein's mass–energy equivalence and the Heisenberg uncertainty principle, and is c²/h ≈ 1.3563925 × 10⁵⁰ bits per second per kilogram.
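The quoted value follows directly from the two exact SI-defined constants. A minimal sketch reproducing it:

```python
# Sketch: reproduce Bremermann's limit c^2/h from the exact SI constants.
c = 299_792_458.0        # speed of light, m/s (exact by definition)
h = 6.626_070_15e-34     # Planck constant, J*s (exact since the 2019 SI redefinition)

limit = c**2 / h         # bits per second per kilogram
print(f"{limit:.7e}")    # ~1.3563925e+50
```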

This value establishes an asymptotic bound on adversarial resources when designing cryptographic algorithms, as it can be used to determine the minimum size of encryption keys or hash values required to create an algorithm that could never be cracked by a brute-force search. For example, a computer with the mass of the entire Earth operating at Bremermann's limit could perform approximately 10⁷⁵ mathematical computations per second. If one assumes that a cryptographic key can be tested with only one operation, then a typical 128-bit key could be cracked in under 10⁻³⁶ seconds. However, a 256-bit key (which is already in use in some systems) would take about two minutes to crack. Using a 512-bit key would increase the cracking time to roughly 10⁷² years, without increasing the time for encryption by more than a constant factor (depending on the encryption algorithms used).
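These figures are easy to check. A sketch, assuming an Earth mass of about 5.972 × 10²⁴ kg and the idealisation that testing one key costs exactly one operation:

```python
# Sketch: brute-force search times for an Earth-mass computer
# running at Bremermann's limit, one key test per operation.
c = 299_792_458.0                          # speed of light, m/s
h = 6.626_070_15e-34                       # Planck constant, J*s
earth_mass = 5.972e24                      # kg (approximate)

ops_per_second = (c**2 / h) * earth_mass   # ~1e75 operations per second
seconds_per_year = 365.25 * 24 * 3600

for bits in (128, 256, 512):
    seconds = 2**bits / ops_per_second     # time to exhaust the keyspace
    print(f"{bits}-bit key: {seconds:.2e} s "
          f"(~{seconds / seconds_per_year:.2e} years)")
```

Running this reproduces the three regimes in the text: well under 10⁻³⁶ s for 128 bits, a couple of minutes for 256 bits, and on the order of 10⁷² years for 512 bits.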

The limit has been further analysed in later literature as the maximum rate at which a system with energy spread $$\Delta E$$ can evolve from one state into an orthogonal, and hence distinguishable, state: $$\textstyle \Delta t = \frac{ \pi \hbar} {2 \Delta E}.$$ In particular, Margolus and Levitin have shown that a quantum system with average energy E takes at least time $$\textstyle \Delta t = \frac{ \pi \hbar} {2 E}$$ to evolve into an orthogonal state. This is one of the quantum speed limit theorems. However, it has been shown that chaining multiple computations together, or access to quantum memory, can in principle allow computational algorithms that require an arbitrarily small amount of energy and time per elementary computation step.
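The Margolus–Levitin bound can be sketched numerically; for instance, one joule of average energy permits at most on the order of 10³³ orthogonal state transitions per second:

```python
import math

# Sketch: Margolus-Levitin minimum time to evolve into an orthogonal
# state, dt = pi * hbar / (2 * E), for a system with average energy E.
hbar = 1.054_571_817e-34   # reduced Planck constant h / (2*pi), J*s

def min_orthogonal_time(energy_joules: float) -> float:
    """Shortest time for a state with average energy E to become orthogonal."""
    return math.pi * hbar / (2.0 * energy_joules)

# ~1.66e-34 s at E = 1 J, i.e. ~6e33 orthogonal transitions per second.
print(min_orthogonal_time(1.0))
```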