Markov's principle

Markov's principle (also known as the Leningrad principle), named after Andrey Markov Jr., is a conditional existence statement for which there are many equivalent formulations, as discussed below. The principle is logically valid classically, but not in intuitionistic constructive mathematics. However, many particular instances of it are nevertheless provable in a constructive context as well.

History
The principle was first studied and adopted by the Russian school of constructivism, together with choice principles and often with a realizability perspective on the notion of mathematical function.

In computability theory
In the language of computability theory, Markov's principle is a formal expression of the claim that if it is impossible that an algorithm does not terminate, then for some input it does terminate. This is equivalent to the claim that if a set and its complement are both computably enumerable, then the set is decidable. These statements are provable in classical logic.
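The equivalence with decidability can be made concrete. The following sketch (with illustrative names; `enum_s` and `enum_complement` are assumed to be enumerators of the set and of its complement) decides membership by dovetailing the two enumerations: every number eventually appears in exactly one of them, so the search halts.

```python
from itertools import count

def decider(enum_s, enum_complement):
    """Build a membership test for a set S from enumerators of S and of
    its complement. The two enumerations are interleaved (dovetailed);
    every number appears in exactly one of them, so the loop halts."""
    def member(n):
        gen_s = enum_s()
        gen_c = enum_complement()
        while True:
            if next(gen_s) == n:
                return True
            if next(gen_c) == n:
                return False
    return member

# Toy example: the even numbers, given only as an enumeration.
evens = lambda: (2 * k for k in count())
odds = lambda: (2 * k + 1 for k in count())
is_even = decider(evens, odds)
```

Of course, for this toy example a direct test is available; the point is that `member` uses only the enumerations, never a decision procedure.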

In intuitionistic logic
In predicate logic, a predicate P over some domain is called decidable if for every x in the domain, either P(x) holds, or the negation of P(x) holds. This is not trivially true constructively.

Markov's principle then states: For a decidable predicate P over the natural numbers, if P cannot be false for all natural numbers n, then it is true for some n. Written using quantifiers:


 * $$\Big(\forall n \big(P(n) \vee \neg P(n)\big) \wedge \neg \forall n\, \neg P(n)\Big) \rightarrow \exists n\, P(n)$$
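For contrast, the classical validity of the principle can be exhibited directly. The following Lean 4 sketch proves it by contradiction using the classical axiom (note that, classically, the decidability hypothesis is not even needed); constructively, no such proof term exists.

```lean
-- Classically, Markov's principle follows by contradiction:
-- if no n satisfied P, the hypothesis ¬∀n, ¬P n would be refuted.
theorem markov {P : Nat → Prop} (h : ¬ ∀ n, ¬ P n) : ∃ n, P n :=
  Classical.byContradiction fun hne => h fun n hp => hne ⟨n, hp⟩
```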

Markov's rule
Markov's rule is the formulation of Markov's principle as a rule. It states that $$\exists n\;P(n)$$ is derivable as soon as $$\neg \neg \exists n\;P(n)$$ is, for $$P$$ decidable. Formally,


 * $$\forall n \big(P(n)\lor \neg P(n)\big),\ \neg \neg \exists n\;P(n)\ \ \vdash\ \ \exists n\;P(n)$$

Anne Troelstra proved that it is an admissible rule in Heyting arithmetic. Later, the logician Harvey Friedman showed that Markov's rule is an admissible rule in first-order intuitionistic logic, Heyting arithmetic, and various other intuitionistic theories, using the Friedman translation.

In Heyting arithmetic
Markov's principle is equivalent in the language of arithmetic to:
 * $$\neg \neg \exists n\;f(n)=0 \rightarrow \exists n\;f(n)=0$$

for $$f$$ a total recursive function on the natural numbers.
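Computationally, this arithmetic form corresponds to an unbounded search. A minimal sketch, assuming `f` is a total function on the naturals:

```python
def markov_search(f):
    """Unbounded search for a zero of a total function f.
    Markov's principle asserts that if it is contradictory that f has
    no zero, this search terminates; the search itself carries no
    a-priori bound on how long that takes."""
    n = 0
    while f(n) != 0:
        n += 1
    return n
```

For instance, `markov_search(lambda n: 17 - n)` returns 17. The constructive objection is not that the loop is ill-defined, but that termination is claimed from a merely negative hypothesis.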

In the presence of Church's thesis principle, the principle is equivalent to its form for primitive recursive functions. Using Kleene's T predicate, the latter may be expressed as
 * $$\forall e\;\forall x\;\big(\neg \neg \exists w\; T_1(e, x, w) \rightarrow \exists w\; T_1(e, x, w)\big)$$

Realizability
If constructive arithmetic is translated using realizability into a classical meta-theory that proves the $$\omega$$-consistency of the relevant classical theory (for example, Peano arithmetic if we are studying Heyting arithmetic), then Markov's principle is justified: a realizer is the constant function that takes a realization that $$P$$ is not everywhere false to the unbounded search that successively checks whether $$P(0), P(1), P(2),\dots$$ holds. If $$P$$ is not everywhere false, then by $$\omega$$-consistency there must be a term for which $$P$$ holds, and each term will be checked by the search eventually. If, however, $$P$$ does not hold anywhere, then the domain of the constant function must be empty, so although the search does not halt it still holds vacuously that the function is a realizer. By the law of the excluded middle (in our classical meta-theory), $$P$$ must either hold somewhere or hold nowhere, so in either case this constant function is a realizer.

If instead the realizability interpretation is used in a constructive meta-theory, then it is not justified. Indeed, for first-order arithmetic, Markov's principle exactly captures the difference between a constructive and classical meta-theory. Specifically, a statement is provable in Heyting arithmetic with extended Church's thesis if and only if there is a number that provably realizes it in Heyting arithmetic; and it is provable in Heyting arithmetic with extended Church's thesis and Markov's principle if and only if there is a number that provably realizes it in Peano arithmetic.

In constructive analysis
Markov's principle is equivalent, in the language of real analysis, to the following principles:


 * For each real number x, if it is contradictory that x is equal to 0, then there exists a rational number y such that 0 < y < |x|, often expressed by saying that x is apart from, or constructively unequal to, 0.
 * For each real number x, if it is contradictory that x is equal to 0, then there exists a real number y such that xy = 1.

Modified realizability does not justify Markov's principle, even if classical logic is used in the meta-theory: there is no realizer in the language of the simply typed lambda calculus, since that language is not Turing-complete and arbitrary loops cannot be defined in it.

Weak Markov's principle
The weak Markov's principle is a weaker form of the principle. It may be stated in the language of analysis, as a conditional statement for the positivity of a real number:
 * $$\forall (x\in\mathbb{R})\, \Big(\forall(y\in\mathbb{R}) \big(\neg\neg(0 < y) \lor \neg\neg(y < x)\big)\Big) \,\to\, (0 < x).$$

This form can be justified by Brouwer's continuity principles, whereas the stronger form contradicts them. Thus the weak Markov principle can be derived from intuitionistic, realizability, and classical reasoning, in each case for different reasons, but it is not valid in the general constructive sense of Bishop, nor provable in the set theory $$\mathsf{IZF}$$.

To understand what the principle is about, it helps to inspect a stronger statement. The following expresses that any real number $$x$$, such that no non-positive $$y$$ is not below it, is positive:
 * $$\nexists(y \le 0)\, x \le y \,\to\, (0 < x),$$

where $$x \leq y$$ denotes the negation of $$y < x$$. This is a stronger implication because its antecedent is weaker, i.e. easier to satisfy. Note that here a logically positive statement is concluded from a logically negative one. It is implied by the weak Markov's principle when elevating De Morgan's law for $$\neg A\lor \neg B$$ to an equivalence.

Assuming classical double-negation elimination, the weak Markov's principle becomes trivial, expressing that a number larger than all non-positive numbers is positive.

Extensionality of functions
A function $$f: X \to Y$$ between metric spaces is called strongly extensional if $$d(f(x),f(y)) > 0 $$ implies $$d(x,y) > 0$$, which is classically just the contrapositive of the statement that $$f$$ preserves equality. Markov's principle can be shown to be equivalent to the proposition that all functions between arbitrary metric spaces are strongly extensional, while the weak Markov's principle is equivalent to the proposition that all functions from complete metric spaces to metric spaces are strongly extensional.