Talk:Chebyshev function

Crossover Points
Does anyone know anything about the first integers at which the Chebyshev functions exceed their arguments? I think that would be a nice thing to include in this article, even if the answer is that the first such examples are unknown. Kyle1009 (talk) 03:24, 17 August 2014 (UTC)

I can't find the exact source, but I'm pretty sure that $$ \vartheta(x) < x $$ holds for x below 8*10^11, although I can neither prove the assertion for all x nor find a proof or counterexample.

From my tests, $$ \psi(x) > x $$ for many values of x. Those values below 100 are 19, 31, 32, 43, 47, 49, 53, 61, 73, 74, 75, 79, 83, 84, 89, and this inequality seems to hold for *most* large values of x. Pieater3.14159265 (talk) 21:54, 20 June 2015 (UTC)
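This list is easy to reproduce; here is a minimal Python sketch (the helper names `is_prime` and `psi` are mine), computing ψ(n) straight from the definition as the sum of log p over prime powers p^k ≤ n:

```python
import math

def is_prime(m):
    """Trial-division primality test; adequate for small m."""
    if m < 2:
        return False
    return all(m % d for d in range(2, math.isqrt(m) + 1))

def psi(x):
    """Second Chebyshev function: sum of log p over prime powers p^k <= x."""
    total = 0.0
    for p in range(2, int(x) + 1):
        if is_prime(p):
            pk = p
            while pk <= x:          # count p, p^2, p^3, ... up to x
                total += math.log(p)
                pk *= p
    return total

# Integers n below 100 with psi(n) > n
crossovers = [n for n in range(2, 100) if psi(n) > n]
print(crossovers)  # [19, 31, 32, 43, 47, 49, 53, 61, 73, 74, 75, 79, 83, 84, 89]
```

The output matches the fifteen values quoted above; the margins are comfortable enough (e.g. ψ(75) ≈ 75.095) that floating-point error does not affect the list.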

Riemann hypothesis
I doubt the equation
 * $$\left|\sum_{\rho} \frac{x^{\rho}}{\rho}\right| \le \left(\sum_{\gamma} \frac{1}{\gamma}\right) \sqrt x = O(\sqrt x \log^2 x)$$

is meaningful. If the sum $$\sum_\gamma\frac1\gamma$$ is over all nontrivial roots (ordered according to the absolute value of the imaginary part, per the usual convention), then the positive and negative terms cancel, and the result is zero, which is clearly false. If the sum is intended to run only over roots with positive imaginary part, then it is divergent, as there are $$\Theta(\log n)$$ zeros with imaginary part in [n, n + 1]. I'm afraid that the reason for the $$O(\sqrt x\log^2x)$$ bound is much more subtle than what is alluded to here. -- EJ 20:11, 30 August 2006 (UTC)

Got a question: if $$ \Psi (x) \sim x $$ for large x, would it then hold that $$ \frac{d\Psi (x)}{dx} \sim 1 $$ for large x? --85.85.100.144 13:40, 13 December 2006 (UTC)
 * I know I'm replying to a very old post, but just in case: since psi(x) is a step function, its slope is 0 wherever the derivative exists (between the jumps at prime powers), so I think the derivative $$ \frac{d\Psi (x)}{dx} $$ would be zero as well. Pieater3.14159265 (talk) 01:02, 31 July 2015 (UTC)
 * Yes, I agree: a derivative wouldn't make sense, since it is a step function. Approxilator (talk) 21:18, 1 January 2024 (UTC)

Asymptotics
Are there any good asymptotics for the theta function? I see that A002110 has $$\vartheta(n)\sim(1 + o(1)) \cdot n \cdot \log(n) = \ln(n^n(1+o(1))^{n\log n})$$, but this seems to be worse than $$\vartheta(n)\sim n$$, and the latter is already an overstatement for $$0<n<2,000,000,000$$ by my calculation. Any thoughts? CRGreathouse (t | c) 04:32, 16 September 2006 (UTC)


 * I'm surprised better estimates aren't mentioned here. The relevant sections to look at would be Hardy and Wright's first chapter on the series of primes, or almost any intro analytic number theory textbook. We have $$ \vartheta(x) \sim \psi(x) \sim x $$ (a statement which is in fact equivalent to the Prime Number Theorem). It is easy to show that $$ \vartheta(x) \sim \psi(x) $$; the second estimate is essentially how one normally proves the prime number theorem. Explicit results about the error bound are known - I think the best currently is due to Dusart. If you want, I can go look them up. JoshuaZ 04:59, 16 September 2006 (UTC)


 * Oh, I feel foolish now. I actually have Dusart's paper on my hard drive, but I forgot it had information on this.  I even have his research report, not just the MComp paper... I'll look through it and update the article.  Gosh do I feel stupid!


 * By the way—are there better results conditional on the RH? CRGreathouse (t | c) 05:07, 16 September 2006 (UTC)


 * Yes. In fact, that is how one standardly shows that RH gives better bounds in the PNT. The rough idea is that as the zeros are pushed away from the line Re(s) = 1, we get better estimates on $$ \vartheta(x) $$ and $$ \psi(x) $$, and hence a better bound on $$ \Pi(x) $$. The relevant chapter of Hardy and Wright (or Apostol's Intro Analytic Number Theory) should give a decent explanation of the basic ideas. Edwards's book on the zeta function has all the relevant details worked out. JoshuaZ 05:42, 16 September 2006 (UTC)

The third inequality has a reference to x. This cannot be correct, since the left side does not mention x at all. —Preceding unsigned comment added by 135.245.10.2 (talk) 17:56, 13 January 2011 (UTC)

Using the syntax of the Wolfram Language, an asymptotic approximation to the theta function is Sum[MoebiusMu[n] x^(1/n), {n, 1, Floor[Log[2, x]]}], which is illustrated at First Chebyshev Function. StvC (talk) 16:40, 5 November 2016 (UTC)
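That Wolfram expression translates directly to Python; it comes from Möbius inversion of $$ \psi(x) = \sum_n \vartheta(x^{1/n}) $$ together with $$ \psi(y) \approx y $$. A sketch (the function names are mine, and the brute-force helpers are only meant for small x):

```python
import math

def mobius(n):
    """Möbius function mu(n) via trial factorization; fine for small n."""
    result = 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            n //= d
            if n % d == 0:      # squared prime factor => mu = 0
                return 0
            result = -result
        d += 1
    if n > 1:                   # leftover prime factor
        result = -result
    return result

def theta_approx(x):
    """Approximation theta(x) ~ sum of mu(n) * x^(1/n), n up to log2(x)."""
    return sum(mobius(n) * x ** (1.0 / n)
               for n in range(1, int(math.log2(x)) + 1))

def theta_exact(x):
    """Exact theta(x): sum of log p over primes p <= x (trial division)."""
    return sum(math.log(p) for p in range(2, int(x) + 1)
               if all(p % d for d in range(2, math.isqrt(p) + 1)))

print(theta_approx(1000))   # about 954.88
print(theta_exact(1000))    # the true theta(1000), about 956.2
```

For x = 1000 the approximation is already within about 0.2% of the true value.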

Tchebychev function
I saw some manuscripts claiming to establish that RH is true because the Tchebychev function has the asymptotic $$\Psi(x) \ \sim \ x^{1/2}$$, but if the Schmidt result is correct this is impossible. Moreover, I was trying to understand the Schmidt result, and I could not find the result that appears in Wikipedia. So I hope somebody can explain what is correct.

Erhard Schmidt, "Über die Anzahl der Primzahlen unter gegebener Grenze", Mathematische Annalen, 57 (1903), pp. 195–204.

I also cannot find this result in the Schmidt paper. He states 4 theorems. Kindly indicate which one you think is the one cited in Wikipedia. Also, Schmidt does not seem to discuss psi(x). —Preceding unsigned comment added by 135.245.10.2 (talk) 17:47, 17 January 2011 (UTC)
 * Hmm. As far as I can see, Schmidt proves in his paper that there are infinitely many x such that
 * $$\Pi(x)-\operatorname{Li}(x)>\tfrac1{29}\frac{\sqrt x}{\log x},$$
 * and infinitely many x such that
 * $$\Pi(x)-\operatorname{Li}(x)<-\tfrac1{29}\frac{\sqrt x}{\log x}.$$
 * That it speaks of Π instead of ψ is no problem: partial summation applied to the above results immediately implies that there are arbitrarily large x with
 * $$\psi(x)-x>K\sqrt x,$$
 * and arbitrarily large x with
 * $$\psi(x)-x<-K\sqrt x,$$
 * where K is any constant smaller than 1/29. What is a problem is that this constant has, indeed, to be smaller than 1/29, whereas we want it to be arbitrarily large. Now, the question is what is in the Hardy and Littlewood paper.—Emil J. 18:27, 17 January 2011 (UTC)
 * Hardy and Littlewood state Schmidt's result essentially as I wrote above (without specifying a value for K, just saying that it exists). They also prove the stronger result that there is a constant K such that there are arbitrarily large x with
 * $$\psi(x)-x>K\sqrt x\log\log\log x,$$
 * and arbitrarily large x with
 * $$\psi(x)-x<-K\sqrt x\log\log\log x.$$
 * This does imply that Schmidt's formula holds for every K, but it is no longer Schmidt's result.—Emil J. 18:49, 17 January 2011 (UTC)

Seems to me that the article would be improved by just stating the Hardy and Littlewood result and omitting the Schmidt result. Do you happen to know if theta(x)-x changes sign infinitely often? —Preceding unsigned comment added by 135.245.8.4 (talk) 15:51, 18 January 2011 (UTC)

As far as I know, theta(x) < x at least for x < 8*10^11 (some Dusart paper), but I don't know whether it's true for all x (i.e., whether theta(x) - x ever changes sign). If you find out, let me know! Pieater3.14159265 (talk) 02:28, 23 June 2015 (UTC)

Chebyshev function
The text of the first section states "The Chebyshev function is often used in proofs relate ...." Yet the preceding text defines two different Chebyshev functions. Which one, or both? — Preceding unsigned comment added by Reddwarf2956 (talk • contribs) 12:29, 18 September 2012 (UTC)

From my experience, both are used in proofs related to prime numbers, but the second Chebyshev function $$ \psi(x) $$ is probably used more often, especially in proofs relating to the prime-counting function. Pieater3.14159265 (talk) 21:39, 20 June 2015 (UTC)

Bounds on Chebyshev's First Function
I have two questions relating to Chebyshev's First Function. First, is it known that $$ \vartheta(x) < x $$ for all x (or that it's not)? Are there any conjectures that would imply this? Are there any other helpful bounds on $$ \vartheta(x) $$ given by someone other than Dusart (I think I've found all of his papers on the subject) without assuming RH? The only other ones I can find are slight improvements on his, which don't seem to be all that helpful. Second, is there any good way of calculating $$ \vartheta(x) $$ other than summing the logarithms of primes? This ends up either taking too long or losing a startling degree of accuracy. Pieater3.14159265 (talk) 07:24, 19 June 2015 (UTC)
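On the second question: as far as I know, summing the logarithms of the primes is still the standard way to compute $$ \vartheta(x) $$ exactly, but the loss of accuracy can be reduced by using math.fsum, which performs a single rounding at the end instead of accumulating error in a running total. A sketch (the function name is mine):

```python
import math

def theta(x):
    """First Chebyshev function: theta(x) = sum of log p over primes p <= x.

    Uses a sieve of Eratosthenes to find the primes, and math.fsum
    (correctly-rounded summation) to avoid accumulated floating-point
    error from a naive running total.
    """
    n = int(x)
    if n < 2:
        return 0.0
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            # cross off multiples of p starting at p^2
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return math.fsum(math.log(p) for p in range(2, n + 1) if sieve[p])

print(theta(100))   # about 83.7284
```

The sieve keeps the memory linear in x, so this remains slow for very large x; the accuracy, however, is as good as double precision allows.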

Upper Bound on theta(x)
Does anyone object to my replacing the upper bound on $$ \theta(x) $$ currently here (from Rosser and Schoenfeld) with Pierre Dusart's bound ($$ \theta(x) < 1.000028x $$ from http://arxiv.org/pdf/1002.0442v1.pdf, page 2)? There it is listed as $$ \theta(x)-x < \frac{1}{36260}x $$; since $$ 1/36260 \approx 0.0000276 $$, the two forms agree. Pieater3.14159265 (talk) 23:18, 10 August 2015 (UTC)

Log vs Ln
I think "ln" is a better notation than "log" for the following reasons:


 * Consistency: At present this article uses both log and ln for the same function
 * Clarity: ln is unambiguous contrary to log
 * Compliance with standards: ln is probably more widely recognizable, and recommended by ISO.

The only argument in favor of "log" seems to be that "log" is more accepted in the field of number theory. I think this argument weighs less than the previous three. 2A01:CB00:A34:1000:ED0E:D7C1:D5BD:64BC (talk) 14:21, 13 June 2021 (UTC)

Variational formulation
It appears this section is incorrect. See https://math.stackexchange.com/q/3339574.

StvC (talk) 19:56, 18 June 2022 (UTC)


 * Removed. Another way this can be shown to be false: for fixed $$c>1$$, choose a sequence of functions tending to 1, e.g. of the form $$f(x)=e^{-dx}$$ with $$d \to 0^+ $$. The first term is negative and the second term goes to infinity. Kclisp (talk) 06:50, 30 November 2023 (UTC)