Wikipedia:Reference desk/Archives/Mathematics/2020 April 17

= April 17 =

Prime factor of 2^2^33+1
How much is known about the requirements for a prime factor of this big number?? I specifically want to know the lower bound on a prime factor. Georgia guy (talk) 13:44, 17 April 2020 (UTC)
 * It is the Fermat number F33. As you probably know since you ask, it is the smallest Fermat number with no known factor and unknown primality status (F20 and F24 have no known factor but are known to be composite). Per Fermat number, any factor of F33 must be of form k × 2^35 + 1. According to http://www.prothsearch.com/fermat.html#Search, the search limit for k was 4.8 × 10^17 on January 25, 2020. It appears from http://www.fermatsearch.org/stat/done.php to be 4.96 × 10^17 on April 3, 2020. That means a factor must be above 1.7 × 10^28. PrimeHunter (talk) 15:04, 17 April 2020 (UTC)
 * But this number is so huge that that bound is nowhere close to proving it is prime. It has more than 2 billion digits; the square of your figure is only 2.89 × 10^56. (In other words, your info would be sufficient for the number to be prime if the number were less than 2.89 × 10^56.) Georgia guy (talk) 15:31, 17 April 2020 (UTC)
 * In general it's much easier to test for primality than to actually find factors, and in this case you'd run Pépin's test (or something similar) to see if F33 is prime. The problem is that F33 is so huge that it's not even feasible to run that test in this case. Actually, imo it's unlikely that a substantially faster test will be found; incremental improvements maybe, but probably not orders of magnitude. If so, then the obstacle to testing the primality of F33 is technological rather than mathematical. --RDBury (talk) 17:29, 17 April 2020 (UTC)
 * F33 is almost certainly composite, like all other untested Fermat numbers. Fermat number says: "It is possible that the only primes of this form are 3, 5, 17, 257 and 65,537. Indeed, Boklan and Conway published in 2016 a very precise analysis suggesting that the probability of the existence of another Fermat prime is less than one in a billion. " It appears they didn't consider the trial factor limit. The current chance is actually a little better than one in a billion in my estimate. PrimeHunter (talk) 17:57, 17 April 2020 (UTC)
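The lower bound PrimeHunter quotes can be rechecked in a few lines of Python; this is a sketch using the April 3 search limit cited above (the limit value is taken from the post, not independently verified):

```python
# Any prime factor of the Fermat number F33 = 2^(2^33) + 1 must have the
# form k * 2^35 + 1 (factors of F_n are of the form k * 2^(n+2) + 1).
k_limit = 496_000_000_000_000_000          # 4.96e17, the search limit quoted above

smallest_possible_factor = k_limit * 2**35 + 1
print(f"{smallest_possible_factor:.2e}")   # about 1.7e28, matching the post

# Sanity check of the factor form on a known example: 641 divides F5,
# and 641 = 5 * 2^7 + 1, so 2^(5+2) divides 641 - 1 as required.
assert (2**(2**5) + 1) % 641 == 0
assert (641 - 1) % 2**7 == 0
```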


 * By the Von Staudt–Clausen theorem, the nonexistence of any further Fermat primes is equivalent to
 * $$B_{2^{n}}\bmod 1 = \frac{7764758227}{8589934590}$$
 * holding for all $$n\geq 16$$, where $$B_k$$ are the Bernoulli numbers. Count Iblis (talk) 19:59, 17 April 2020 (UTC)
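Count Iblis's fraction can be verified with exact rational arithmetic. A sketch: by Von Staudt–Clausen, B_{2k} plus the sum of 1/p over primes p with (p − 1) | 2k is an integer, and p − 1 divides a power of 2 exactly when p = 2^j + 1.

```python
from fractions import Fraction
from math import floor

def is_prime(m):
    """Trial division; fine for the small candidates below."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

n = 16
# Von Staudt-Clausen: B_{2^n} + sum of 1/p over primes p with (p-1) | 2^n
# is an integer.  Here (p-1) | 2^n exactly when p = 2^j + 1 for some j <= n.
primes = [2**j + 1 for j in range(n + 1) if is_prime(2**j + 1)]
assert primes == [2, 3, 5, 17, 257, 65537]   # 2 and the five known Fermat primes

s = sum(Fraction(1, p) for p in primes)
frac_part = 1 - (s - floor(s))   # fractional part of -s, i.e. B_{2^n} mod 1
print(frac_part)                 # 7764758227/8589934590
```

The denominator 8589934590 = 2 × 3 × 5 × 17 × 257 × 65537, so a further Fermat prime would change the fraction for large enough n.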

Running a pseudoprime test on that number would be on the expensive and slow side, but seems doable with an accessible amount of computer resources compared to what's been spent on some factorization efforts and the like. The pseudoprime test can't prove primality, but it can prove compositeness and usually does (if the number is composite). So it would in all likelihood settle the primality question for any particular number. In the unlikely event that the number passes pseudoprime tests, that would be of considerable mathematical interest in its own right, and it might be feasible or almost-feasible to perform an AKS primality test. 2601:648:8202:96B0:E0CB:579B:1F5:84ED (talk) 00:46, 18 April 2020 (UTC)
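The idea of proving compositeness without finding a factor can be sketched on small Fermat numbers (base 3 is chosen deliberately: every Fermat number, prime or not, passes the base-2 test):

```python
def fermat_prp(N, base=3):
    """Base-3 Fermat pseudoprime test: if the result is False, N is
    certainly composite, even though no factor is exhibited."""
    return pow(base, N - 1, N) == 1

for n in range(1, 6):
    F = 2**(2**n) + 1
    print(f"F{n} passes: {fermat_prp(F)}")
# F1..F4 pass (they are prime); F5 fails, proving it composite
# without ever exhibiting its factor 641.
```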
 * All efficient primality and pseudoprime tests rely on modular exponentiation, which requires too much data sharing for a distributed Internet project (maybe gigabytes every second for this size). It would probably have to run on a supercomputer for years. I don't see that getting funding with a one in a billion chance. Distributed computing projects are great for many other things like trial factoring, and prime searches at a size where each ordinary computer can test one number at a time. For example the Great Internet Mersenne Prime Search (GIMPS), which has set the 15 latest largest known prime records since 1996 (they found two more Mersenne primes out of order). PrimeHunter (talk) 01:23, 18 April 2020 (UTC)
 * By the way, an AKS primality test is completely infeasible at both this size and a million digits, but it doesn't matter here since the fastest test for Fermat numbers is Pépin's test, which proves primality. If you test a number N where the complete factorization of N−1 or N+1 is known then primality proving is efficient, at worst slower by a small constant factor than a pseudoprime test. There are also efficient cases with partial factorization of N ± 1, but nearly all large prime searches and therefore nearly all the largest known primes have complete factorization of N+1 or N−1, usually by simply choosing N = k × 2^n ± 1 for small k, so that N ∓ 1 = k × 2^n factors trivially. PrimeHunter (talk) 01:35, 18 April 2020 (UTC)
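Pépin's test mentioned above is short enough to state in code; a minimal sketch (for n ≥ 1, F_n is prime iff 3^((F_n − 1)/2) ≡ −1 mod F_n, and by Proth's theorem a composite F_n can never produce −1):

```python
def pepin(n):
    """Pepin's test: for n >= 1, F_n = 2^(2^n) + 1 is prime
    iff 3 ** ((F_n - 1) // 2) is congruent to -1 mod F_n."""
    F = 2**(2**n) + 1
    return pow(3, (F - 1) // 2, F) == F - 1

print([n for n in range(1, 8) if pepin(n)])   # [1, 2, 3, 4]
```

For F33 this single pow call would need 2^33 − 1 sequential squarings of a 2-billion-digit residue, which is the technological wall discussed above.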
 * FWIW, the Proth Search page linked above lists the current factorization status of Fermat numbers. Pépin's test is the fastest known test; there may be a faster one that no one has discovered yet, though after 143 years it seems unlikely and, as I mentioned above, very unlikely that it would be a substantial improvement. Also, factors of larger Fermat numbers are known, so proving compositeness is easy sometimes, mostly depending (I gather) on how large the smallest factor is. Finally, even if F33 were to be shown composite, either by Pépin's test or by finding a small factor, I have the feeling that instead of acknowledging the achievement in computing power and programming skill that this feat would require, we would instead just be having this same conversation about F34, which is the next candidate. --RDBury (talk) 08:19, 18 April 2020 (UTC)
 * The speed of Pépin's test depends on the speed of modular exponentiation and has been improved many times with new multiplication algorithms. A correct test just has to give a correct answer, so this Fermat prime test is almost certainly correct: "Fn is prime iff n ≤ 4". Maybe somebody will prove it some day. Forms with infinitely many expected primes like Mersenne primes may be much harder. PrimeHunter (talk) 14:06, 18 April 2020 (UTC)