Wikipedia:Reference desk/Archives/Mathematics/2012 April 5

= April 5 =

Series
Let's say you have a function, and you can find anti-derivatives of any order. (The example that I have in mind is $$f(x) = |x|$$.) Next, you sum all of these anti-derivatives to give, hopefully, a new function. In the case of $$f(x) = |x|$$ you get
 * $$ \sigma(x) = \frac{|x|}{x}(e^x - 1) \, . $$

Is there a name for this kind of construction? Can anyone point me towards any interesting references? — Fly by Night  ( talk )  01:55, 5 April 2012 (UTC)
 * If f is differentiable then σ satisfies the first order differential equation σ' - σ = f'. Rckrone (talk) 04:36, 5 April 2012 (UTC)
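On the branch x &gt; 0 (where |x| = x, so σ = e^x − 1), this ODE can be checked symbolically; a minimal sketch using sympy (my choice of tool, not something used in the thread):

```python
import sympy as sp

# Work on the branch x > 0, where |x| = x and sigma reduces to e^x - 1.
x = sp.symbols('x', positive=True)
f = sp.Abs(x)                              # f(x) = |x|
sigma = (sp.Abs(x) / x) * (sp.exp(x) - 1)  # the sigma from the question

# Rckrone's observation: sigma' - sigma = f'
lhs = sp.diff(sigma, x) - sigma
rhs = sp.diff(f, x)
print(sp.simplify(lhs - rhs))              # -> 0
```

Here sigma' − sigma = e^x − (e^x − 1) = 1 = f', confirming the equation on that branch.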
 * Anti-derivatives are not unique, and therefore neither is your resulting $$\sigma$$. I guess you are implicitly assuming initial conditions such as $$\sigma^{(n)}(0) = k_n$$ for every n. I've never encountered this before. Widener (talk) 07:17, 5 April 2012 (UTC)


 * Good point, and the solution to that differential equation is $$f(x) + c e^x + e^x \int_0^x f(y) e^{-y} dy$$, which seems to give the desired answer for $$c = 1$$.--Itinerant1 (talk) 09:12, 5 April 2012 (UTC)
 * I think you must mean $$c = 0$$. I just tested it with $$x=1$$ using the example Fly by Night gave. $$\frac{|1|}{1}(e^1 - 1) = |1| + ce^1 + e^1 \int_0^1 |y| e^{-y} dy \implies e-1 = (c+1)e - 1$$. This is what you get if you assume $$\sigma(0) = f(0)$$ for a general sigma. Widener (talk) 10:32, 5 April 2012 (UTC)
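Widener's check at x = 1 can be reproduced symbolically; a small sketch (again sympy, my choice rather than the thread's):

```python
import sympy as sp

y = sp.symbols('y', positive=True)   # integration variable on [0, 1], so |y| = y

# Right-hand side f(x) + c e^x + e^x * int_0^x f(y) e^{-y} dy at x = 1, f = |.|
def rhs_at_1(c):
    integral = sp.integrate(sp.Abs(y) * sp.exp(-y), (y, 0, 1))
    return sp.Abs(1) + c * sp.E + sp.E * integral

# sigma(1) = (|1|/1)(e^1 - 1) = e - 1, which is matched by c = 0, not c = 1:
print(sp.simplify(rhs_at_1(0) - (sp.E - 1)))   # -> 0
print(sp.simplify(rhs_at_1(1) - (sp.E - 1)))   # -> E (off by a whole e)
```

The integral evaluates to 1 − 2e^{-1}, so the right-hand side at c = 0 is 1 + e − 2 = e − 1, exactly σ(1).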


 * That's right, I'd be setting all of the constants of integration equal to zero. After all:
 * $$ \ker\left( \frac{\operatorname{d}^{n+1}}{\operatorname{d}\! x^{n+1}} \right) = \{ c_0 + c_1x + \cdots + c_nx^n\} \, . $$
 * When we find the anti-derivatives of a function, we get a function plus an arbitrary polynomial, e.g.
 * $$ \int \left(\int |x| \, \operatorname{d}\! x\right) \operatorname{d}\! x = \frac{|x|x^2}{3!} + c_1x + c_0 \, . $$
 * If we work out all of the anti-derivatives and then sum, we get a class of functions:
 * $$ [\sigma] = \frac{|x|}{x}(e^x-1) + \mathbb{R}[[x]] \, . $$
 * It's the leading term in [σ] that I'm interested in, i.e. the class member corresponding to the zero power series (0 ∈ $$\mathbb{R}[[x]]$$). —  Fly by Night  ( talk )  11:25, 6 April 2012 (UTC)
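With all constants of integration set to zero, the partial sums of the anti-derivatives can be generated mechanically; a sketch for the branch x &gt; 0 (sympy, my choice of tool):

```python
import sympy as sp

x = sp.symbols('x', positive=True)   # branch x > 0, so |x| = x

# f plus its first 19 anti-derivatives, all constants of integration zero.
term = x                             # f(x) = |x| = x for x > 0
partial = sp.Integer(0)
for _ in range(20):
    partial += term
    term = sp.integrate(term, x)     # sympy omits the constant, i.e. sets it to 0

# The partial sum is x + x^2/2! + ... + x^20/20!, a truncation of e^x - 1:
target = sp.exp(x) - 1
print(abs(float((partial - target).subs(x, 1))))   # below 1e-10
```

The truncation error at x = 1 is the tail of the exponential series, of order 1/21!, so the partial sums converge rapidly to the leading term of [σ].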

Define a mapping $$J$$ on a suitable function space (say $$L^1$$, although you can do this with spaces of measures too) by
 * $$(Jf)(x) = \int_0^x f(t)\,dt.$$

You want to compute the resolvent operator $$R=(I-J)^{-1}$$ (as the sum of a geometric series). A concrete formula for this is possible using the Fourier transform:
 * $$Rf(x) = \mathcal{F}_x^{-1}\left(\frac{2\pi i \xi}{1+2\pi i\xi} \mathcal{F}_\xi f\right).$$

(This may be up to a constant like $$f(0)$$. I didn't keep careful track of delta functions when computing this.)   Sławomir Biały  (talk) 12:35, 6 April 2012 (UTC)
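Truncating the geometric series R = I + J + J² + ⋯ and applying it to f recovers the σ from the question on x &gt; 0; a sketch using the series form rather than the Fourier formula (sympy, my choice):

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)

# The Volterra operator (Jg)(x) = int_0^x g(t) dt
def J(g):
    return sp.integrate(g.subs(x, t), (t, 0, x))

f = x                                # f(x) = |x| on the branch x > 0
term, Rf = f, sp.Integer(0)
for _ in range(25):                  # truncated resolvent: R = I + J + J^2 + ...
    Rf += term
    term = J(term)

sigma = sp.exp(x) - 1                # (|x|/x)(e^x - 1) for x > 0
print(abs(float((Rf - sigma).subs(x, 1))))   # well below 1e-10
```

Each application of J adds the next anti-derivative, so the truncated resolvent applied to f is exactly the partial sum x + x²/2! + ⋯, which converges to σ.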