Differentiation rules

This is a summary of differentiation rules, that is, rules for computing the derivative of a function in calculus.

Elementary rules of differentiation
Unless otherwise stated, all functions are functions of real numbers (R) that return real values; although more generally, the formulae below apply wherever they are well defined — including the case of complex numbers (C).

Constant term rule
For any value of $$c$$, where $$c \in \mathbb{R}$$, if $$f(x)$$ is the constant function given by $$f(x) = c$$, then $$\frac{df}{dx} = 0$$.

Proof
Let $$c \in \mathbb{R}$$ and $$f(x) = c$$. By the definition of the derivative,


 * $$\begin{align} f'(x) &= \lim_{h \to 0}\frac{f(x + h) - f(x)}{h} \\ &= \lim_{h \to 0} \frac{c - c}{h} \\ &= \lim_{h \to 0} \frac{0}{h} \\ &= \lim_{h \to 0} 0 \\ &= 0 \end{align}$$

This shows that the derivative of any constant function is 0.

Intuitive (geometric) explanation
The derivative of a function at a point is the slope of the line tangent to the curve at that point. The slope of a constant function is zero, because the tangent line to a constant function is horizontal and its angle of inclination is zero.

In other words, the value of the constant function, y, will not change as the value of x increases or decreases.



Differentiation is linear
For any functions $$f$$ and $$g$$ and any real numbers $$a$$ and $$b$$, the derivative of the function $$h(x) = af(x) + bg(x)$$ with respect to $$x$$ is: $$ h'(x) = a f'(x) + b g'(x).$$

In Leibniz's notation this is written as: $$ \frac{d(af+bg)}{dx} = a\frac{df}{dx} +b\frac{dg}{dx}.$$

Special cases include:
 * The constant factor rule $$(af)' = af' $$
 * The sum rule $$(f + g)' = f' + g'$$
 * The difference rule $$(f - g)' = f' - g'.$$
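Linearity can be verified symbolically; the following SymPy sketch uses an arbitrary illustrative pair $f = \sin x$, $g = e^x$ (the particular functions are not essential):

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
f, g = sp.sin(x), sp.exp(x)  # illustrative choices of f and g

# d/dx [a f + b g] computed directly ...
lhs = sp.diff(a*f + b*g, x)
# ... versus a f' + b g' from the linearity rule
rhs = a*sp.diff(f, x) + b*sp.diff(g, x)
print(sp.simplify(lhs - rhs))  # 0
```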

The product rule
For the functions $$f$$ and $$g$$, the derivative of the function $$h(x) = f(x) g(x)$$ with respect to $$x$$ is $$ h'(x) = (fg)'(x) = f'(x) g(x) + f(x) g'(x).$$ In Leibniz's notation this is written $$\frac{d(fg)}{dx} = g \frac{df}{dx} + f \frac{dg}{dx}.$$
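A quick symbolic check of the product rule, again with an arbitrary illustrative pair of functions:

```python
import sympy as sp

x = sp.symbols('x')
f, g = sp.sin(x), sp.exp(x)  # illustrative choices

lhs = sp.diff(f*g, x)                    # (fg)'
rhs = sp.diff(f, x)*g + f*sp.diff(g, x)  # f'g + fg'
print(sp.simplify(lhs - rhs))  # 0
```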

The chain rule
The derivative of the function $$h(x) = f(g(x))$$ is $$ h'(x) = f'(g(x))\cdot g'(x).$$

In Leibniz's notation, this is written as: $$\frac{d}{dx}h(x) = \left.\frac{d}{dz}f(z)\right|_{z=g(x)}\cdot \frac{d}{dx}g(x),$$ often abridged to $$\frac{dh(x)}{dx} = \frac{df(g(x))}{dg(x)} \cdot \frac{dg(x)}{dx}.$$

Focusing on the notion of maps, and the differential being a map $$\text{D}$$, this is written in a more concise way as: $$ [\text{D} (f\circ g)]_x = [\text{D} f]_{g(x)} \cdot [\text{D}g]_x\,.$$
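The chain rule can likewise be checked on an example composition; here $f = \sin$ and $g(x) = e^x$ are illustrative choices:

```python
import sympy as sp

x = sp.symbols('x')
g = sp.exp(x)                    # inner function (illustrative)

lhs = sp.diff(sp.sin(g), x)      # h'(x) for h = f∘g with f = sin
rhs = sp.cos(g) * sp.diff(g, x)  # f'(g(x)) · g'(x), since sin' = cos
print(sp.simplify(lhs - rhs))  # 0
```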

The inverse function rule
If the function $f$ has an inverse function $g$, meaning that $$g(f(x)) = x$$ and $$f(g(y)) = y,$$ then $$g' = \frac{1}{f'\circ g}.$$

In Leibniz notation, this is written as $$ \frac{dx}{dy} = \frac{1}{\frac{dy}{dx}}.$$
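For a concrete check, take $f(x) = e^x$ with inverse $g(y) = \ln y$; the rule gives $g'(y) = 1/f'(g(y)) = 1/e^{\ln y} = 1/y$, matching the known derivative of the logarithm. A SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x')
y = sp.symbols('y', positive=True)

fprime = sp.diff(sp.exp(x), x)       # f'(x) = e^x
lhs = sp.diff(sp.log(y), y)          # g'(y) computed directly
rhs = 1 / fprime.subs(x, sp.log(y))  # 1 / f'(g(y))
print(sp.simplify(lhs - rhs))  # 0
```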

The polynomial or elementary power rule
If $$f(x) = x^r$$, for any real number $$r \neq 0,$$ then
 * $$f'(x) = rx^{r-1}.$$

When $$r = 1,$$ this becomes the special case that if $$f(x) = x,$$ then $$f'(x) = 1.$$

Combining the power rule with the sum and constant multiple rules permits the computation of the derivative of any polynomial.
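For example, differentiating an arbitrary polynomial term by term (power rule plus the sum and constant-multiple rules) can be checked with SymPy:

```python
import sympy as sp

x = sp.symbols('x')
p = 3*x**4 - 5*x**2 + 7*x - 2  # arbitrary example polynomial

# power rule applied term by term via the sum and constant-multiple rules
print(sp.diff(p, x))  # 12*x**3 - 10*x + 7
```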

The reciprocal rule
The derivative of $$h(x)=\frac{1}{f(x)}$$ for any (nonvanishing) function $f$ is:


 * $$ h'(x) = -\frac{f'(x)}{(f(x))^2}$$ wherever $f$ is non-zero.

In Leibniz's notation, this is written


 * $$ \frac{d(1/f)}{dx} = -\frac{1}{f^2}\frac{df}{dx}.$$

The reciprocal rule can be derived either from the quotient rule, or from the combination of power rule and chain rule.
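The rule can be mirrored symbolically; $f = \cos x$ below is an illustrative choice:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.cos(x)  # illustrative choice of f

lhs = sp.diff(1/f, x)        # (1/f)'
rhs = -sp.diff(f, x) / f**2  # -f'/f^2
print(sp.simplify(lhs - rhs))  # 0
```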

The quotient rule
If $f$ and $g$ are functions, then:
 * $$\left(\frac{f}{g}\right)' = \frac{f'g - g'f}{g^2}\quad$$ wherever $g$ is nonzero.

This can be derived from the product rule and the reciprocal rule.
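A symbolic check, using the illustrative quotient $\sin x / \cos x = \tan x$:

```python
import sympy as sp

x = sp.symbols('x')
f, g = sp.sin(x), sp.cos(x)  # illustrative: f/g = tan x

lhs = sp.diff(f/g, x)
rhs = (sp.diff(f, x)*g - sp.diff(g, x)*f) / g**2
print(sp.simplify(lhs - rhs))  # 0
```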

Generalized power rule
The elementary power rule generalizes considerably. The most general power rule is the functional power rule: for any functions $f$ and $g$,


 * $$(f^g)' = \left(e^{g\ln f}\right)' = f^g\left(f'{g \over f} + g'\ln f\right),\quad$$

wherever both sides are well defined.
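The functional power rule can be verified on an example such as $x^{\sin x}$ (an illustrative choice, with $x > 0$ so that both sides are defined):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f, g = x, sp.sin(x)  # illustrative: f^g = x^{sin x}

lhs = sp.diff(f**g, x)
rhs = f**g * (sp.diff(f, x)*g/f + sp.diff(g, x)*sp.log(f))
print(sp.simplify(lhs - rhs))  # 0
```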

Special cases
 * If $f(x)=x^a\!$, then $f'(x)=ax^{a-1}$ when $a$ is any non-zero real number and $x$ is positive.
 * The reciprocal rule may be derived as the special case where $g(x)=-1\!$.

Derivatives of exponential and logarithmic functions

 * $$ \frac{d}{dx}\left(c^{ax}\right) = {ac^{ax} \ln c } ,\qquad c > 0$$

The equation above is true for all $c$, but for $c < 0$ the derivative yields a complex number.


 * $$ \frac{d}{dx}\left(e^{ax}\right) = ae^{ax}$$


 * $$ \frac{d}{dx}\left( \log_c x\right) = {1 \over x \ln c}, \qquad c > 0,\ c \neq 1$$

The equation above is also true for all $c$, but yields a complex number if $c < 0$.


 * $$ \frac{d}{dx}\left( \ln x\right) = {1 \over x} ,\qquad x > 0.$$


 * $$ \frac{d}{dx}\left( \ln |x|\right) = {1 \over x} ,\qquad x \neq 0.$$


 * $$ \frac{d}{dx}\left( W(x)\right) = {1 \over {x+e^{W(x)}}} ,\qquad x > -{1 \over e},$$ where $$W(x)$$ is the Lambert W function.


 * $$ \frac{d}{dx}\left( x^x \right) = x^x(1+\ln x).$$


 * $$ \frac{d}{dx}\left( f(x)^{ g(x) } \right ) = g(x)f(x)^{g(x)-1} \frac{df}{dx} + f(x)^{g(x)}\ln{( f(x) )}\frac{dg}{dx}, \qquad \text{if }f(x) > 0, \text{ and if } \frac{df}{dx} \text{ and } \frac{dg}{dx} \text{ exist.}$$


 * $$ \frac{d}{dx}\left( f_{1}(x)^{f_{2}(x)^{\left ( \ldots \right )^{f_{n}(x)}}} \right ) = \left [\sum\limits_{k=1}^{n} \frac{\partial }{\partial x_{k}} \left( f_{1}(x_1)^{f_{2}(x_2)^{\left ( \ldots \right )^{f_{n}(x_n)}}} \right ) \right ] \biggr\vert_{x_1 = x_2 = \cdots = x_n = x}, \qquad \text{ if } f_{i} > 0 \text{ and } \frac{df_{i}}{dx} \text{ exists.}$$
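Several of the identities above can be confirmed with SymPy (symbolically, plus a numerical comparison at the arbitrary point $x = 2$ for the Lambert W derivative):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
a, c = sp.symbols('a c', positive=True)

# d/dx c^{ax} = a c^{ax} ln c
assert sp.simplify(sp.diff(c**(a*x), x) - a*c**(a*x)*sp.log(c)) == 0
# d/dx x^x = x^x (1 + ln x)
assert sp.simplify(sp.diff(x**x, x) - x**x*(1 + sp.log(x))) == 0

# d/dx W(x) = 1/(x + e^{W(x)}), checked numerically at x = 2
w_deriv = sp.diff(sp.LambertW(x), x).subs(x, 2).evalf()
claimed = (1 / (x + sp.exp(sp.LambertW(x)))).subs(x, 2).evalf()
print(abs(w_deriv - claimed) < 1e-12)  # True
```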

Logarithmic derivatives
The logarithmic derivative is another way of stating the rule for differentiating the logarithm of a function (using the chain rule):
 * $$ (\ln f)'= \frac{f'}{f} \quad$$ wherever $f$ is positive.

Logarithmic differentiation is a technique which uses logarithms and their differentiation rules to simplify certain expressions before actually applying the derivative.

Logarithms can be used to remove exponents, convert products into sums, and convert division into subtraction — each of which may lead to a simplified expression for taking derivatives.
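A sketch of the technique with the arbitrary illustrative example $f(x) = x^2 \sin x / e^x$: taking the logarithm turns the product and quotient into a sum and difference, whose derivative $(\ln f)' = f'/f$, multiplied back by $f$, recovers $f'$:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = x**2 * sp.sin(x) / sp.exp(x)  # arbitrary example

# (ln f)' = f'/f, so f' = f * (ln f)'
log_deriv = sp.diff(sp.log(f), x)
recovered = sp.simplify(f * log_deriv - sp.diff(f, x))
print(recovered)  # 0
```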

Derivatives of trigonometric functions
 * $$\frac{d}{dx}\sin x = \cos x, \qquad \frac{d}{dx}\arcsin x = \frac{1}{\sqrt{1 - x^2}}$$
 * $$\frac{d}{dx}\cos x = -\sin x, \qquad \frac{d}{dx}\arccos x = -\frac{1}{\sqrt{1 - x^2}}$$
 * $$\frac{d}{dx}\tan x = \sec^2 x, \qquad \frac{d}{dx}\arctan x = \frac{1}{1 + x^2}$$
 * $$\frac{d}{dx}\sec x = \sec x \tan x, \qquad \frac{d}{dx}\operatorname{arcsec} x = \frac{1}{|x|\sqrt{x^2 - 1}}$$
 * $$\frac{d}{dx}\csc x = -\csc x \cot x, \qquad \frac{d}{dx}\operatorname{arccsc} x = -\frac{1}{|x|\sqrt{x^2 - 1}}$$
 * $$\frac{d}{dx}\cot x = -\csc^2 x, \qquad \frac{d}{dx}\operatorname{arccot} x = -\frac{1}{1 + x^2}$$

The derivatives above are for when the range of the inverse secant is $$[0,\pi]$$ and when the range of the inverse cosecant is $$\left[-\frac{\pi}{2},\frac{\pi}{2}\right].$$

It is common to additionally define an inverse tangent function with two arguments, $$\arctan(y,x).$$ Its value lies in the range $$[-\pi,\pi]$$ and reflects the quadrant of the point $$(x,y).$$  For the first and fourth quadrant (i.e. $$x > 0$$) one has $$\arctan(y, x>0) = \arctan(y/x).$$  Its partial derivatives are
 * $$ \frac{\partial \arctan(y,x)}{\partial y} = \frac{x}{x^2 + y^2} \qquad\text{and}\qquad \frac{\partial \arctan(y,x)}{\partial x} = \frac{-y}{x^2 + y^2}.$$
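Both partial derivatives can be checked with SymPy's two-argument arctangent:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
t = sp.atan2(y, x)  # SymPy's two-argument arctangent

print(sp.simplify(sp.diff(t, y) - x/(x**2 + y**2)))  # 0
print(sp.simplify(sp.diff(t, x) + y/(x**2 + y**2)))  # 0
```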

Derivatives of hyperbolic functions
 * $$\frac{d}{dx}\sinh x = \cosh x, \qquad \frac{d}{dx}\operatorname{arsinh} x = \frac{1}{\sqrt{x^2 + 1}}$$
 * $$\frac{d}{dx}\cosh x = \sinh x, \qquad \frac{d}{dx}\operatorname{arcosh} x = \frac{1}{\sqrt{x^2 - 1}}$$
 * $$\frac{d}{dx}\tanh x = \operatorname{sech}^2 x, \qquad \frac{d}{dx}\operatorname{artanh} x = \frac{1}{1 - x^2}$$

See Hyperbolic functions for restrictions on these derivatives.

Derivatives of special functions

 * Gamma function
 * $$\Gamma(x) = \int_0^\infty t^{x-1} e^{-t}\, dt$$
 * $$\begin{align} \Gamma'(x) & = \int_0^\infty t^{x-1} e^{-t} \ln t\,dt \\ & = \Gamma(x) \left(\sum_{n=1}^\infty \left(\ln\left(1 + \dfrac{1}{n}\right) - \dfrac{1}{x + n}\right) - \dfrac{1}{x}\right) \\ & = \Gamma(x) \psi(x) \end{align}$$ with $$\psi(x)$$ being the digamma function, expressed by the parenthesized expression to the right of $$\Gamma(x)$$ in the line above.
 * Riemann zeta function
 * $$\zeta(x) = \sum_{n=1}^\infty \frac{1}{n^x}$$
 * $$\begin{align} \zeta'(x) & = -\sum_{n=1}^\infty \frac{\ln n}{n^x} = -\frac{\ln 2}{2^x} - \frac{\ln 3}{3^x} - \frac{\ln 4}{4^x} - \cdots \\ & = -\sum_{p \text{ prime}} \frac{p^{-x} \ln p}{(1-p^{-x})^2} \prod_{q \text{ prime}, q \neq p} \frac{1}{1-q^{-x}} \end{align}$$

Derivatives of integrals
Suppose that it is required to differentiate with respect to x the function


 * $$F(x)=\int_{a(x)}^{b(x)}f(x,t)\,dt,$$

where the functions $$f(x,t)$$ and $$\frac{\partial}{\partial x}\,f(x,t)$$ are both continuous in both $$t$$ and $$x$$ in some region of the $$(t,x)$$ plane, including $$a(x)\leq t\leq b(x),$$ $$x_0\leq x\leq x_1$$, and the functions $$a(x)$$ and $$b(x)$$ are both continuous and both have continuous derivatives for $$x_0\leq x\leq x_1$$. Then for $$\,x_0\leq x\leq x_1$$:


 * $$ F'(x) = f(x,b(x))\,b'(x) - f(x,a(x))\,a'(x) + \int_{a(x)}^{b(x)} \frac{\partial}{\partial x}\, f(x,t)\; dt\,. $$

This formula is the general form of the Leibniz integral rule and can be derived using the fundamental theorem of calculus.
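A concrete check of the Leibniz integral rule, with the illustrative choices $f(x,t) = xt$, $a(x) = x$, $b(x) = x^2$:

```python
import sympy as sp

x, t = sp.symbols('x t')
f = x*t         # illustrative integrand
a, b = x, x**2  # illustrative limits

F = sp.integrate(f, (t, a, b))  # F(x) in closed form
direct = sp.diff(F, x)          # differentiate after integrating
leibniz = (f.subs(t, b)*sp.diff(b, x)
           - f.subs(t, a)*sp.diff(a, x)
           + sp.integrate(sp.diff(f, x), (t, a, b)))
print(sp.simplify(direct - leibniz))  # 0
```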

Derivatives to nth order
Some rules exist for computing the $n$-th derivative of functions, where $n$ is a positive integer. These include:

Faà di Bruno's formula
If $f$ and $g$ are $n$-times differentiable, then $$ \frac{d^n}{d x^n} [f(g(x))]= n! \sum_{\{k_m\}} f^{(r)}(g(x)) \prod_{m=1}^n \frac{1}{k_m!} \left(\frac{g^{(m)}(x)}{m!} \right)^{k_m}$$ where $ r = \sum_{m=1}^{n} k_m$ and the set $$ \{k_m\}$$ consists of all non-negative integer solutions of the Diophantine equation $ \sum_{m=1}^{n} m k_m = n$.
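Faà di Bruno's formula (with each factor $g^{(m)}/m!$ weighted by $1/k_m!$ and $r = k_1 + \cdots + k_n$) can be implemented directly and checked against SymPy's repeated differentiation; the pair $f = \sin u$, $g = e^x$ and the order $n = 4$ are arbitrary illustrative choices:

```python
import sympy as sp
from itertools import product
from math import factorial

def faa_di_bruno(f, u, g, x, n):
    """n-th derivative of f(g(x)); f is an expression in u, g in x."""
    total = sp.Integer(0)
    # all non-negative integer solutions of 1*k_1 + 2*k_2 + ... + n*k_n = n
    for ks in product(*[range(n // m + 1) for m in range(1, n + 1)]):
        if sum(m * k for m, k in zip(range(1, n + 1), ks)) != n:
            continue
        r = sum(ks)                         # r = k_1 + ... + k_n
        term = sp.diff(f, u, r).subs(u, g)  # f^{(r)}(g(x))
        for m, k in zip(range(1, n + 1), ks):
            term *= (sp.diff(g, x, m) / factorial(m))**k / factorial(k)
        total += term
    return factorial(n) * total

x, u = sp.symbols('x u')
f, g = sp.sin(u), sp.exp(x)  # illustrative example
result = sp.simplify(faa_di_bruno(f, u, g, x, 4) - sp.diff(f.subs(u, g), x, 4))
print(result)  # 0
```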

General Leibniz rule
If $f$ and $g$ are $n$-times differentiable, then $$ \frac{d^n}{dx^n}[f(x)g(x)] = \sum_{k=0}^{n} \binom{n}{k} \frac{d^{n-k}}{d x^{n-k}} f(x) \frac{d^k}{d x^k} g(x)$$
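The general Leibniz rule can be verified by summing the binomial-weighted terms explicitly; the functions and the order $n = 5$ below are illustrative:

```python
import sympy as sp

x = sp.symbols('x')
f, g = sp.sin(x), sp.exp(x)  # illustrative pair
n = 5

leibniz = sum(sp.binomial(n, k) * sp.diff(f, x, n - k) * sp.diff(g, x, k)
              for k in range(n + 1))
print(sp.simplify(leibniz - sp.diff(f*g, x, n)))  # 0
```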

Sources and further reading
These rules are given in many books, on both elementary and advanced calculus, in pure and applied mathematics. Those in this article (in addition to the above references) can be found in:
 * Mathematical Handbook of Formulas and Tables (3rd edition), S. Lipschutz, M.R. Spiegel, J. Liu, Schaum's Outline Series, 2009, ISBN 978-0-07-154855-7.
 * The Cambridge Handbook of Physics Formulas, G. Woan, Cambridge University Press, 2010, ISBN 978-0-521-57507-2.
 * Mathematical methods for physics and engineering, K.F. Riley, M.P. Hobson, S.J. Bence, Cambridge University Press, 2010, ISBN 978-0-521-86153-3
 * NIST Handbook of Mathematical Functions, F. W. J. Olver, D. W. Lozier, R. F. Boisvert, C. W. Clark, Cambridge University Press, 2010, ISBN 978-0-521-19225-5.