In calculus, the product rule is a formula used to find the derivatives of products of two or more functions. It may be stated as
 $(f\cdot g)'=f'\cdot g+f\cdot g'$
or in Leibniz's notation
 ${\dfrac {d}{dx}}(u\cdot v)={\dfrac {du}{dx}}\cdot v+u\cdot {\dfrac {dv}{dx}}.$
In differentials notation, this can be written as
 $d(uv)=u\,dv+v\,du.$
In Leibniz's notation, the derivative of the product of three functions (not to be confused with Euler's triple product rule) is
 ${\dfrac {d}{dx}}(u\cdot v\cdot w)={\dfrac {du}{dx}}\cdot v\cdot w+u\cdot {\dfrac {dv}{dx}}\cdot w+u\cdot v\cdot {\dfrac {dw}{dx}}.$
Discovery
Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials.^{[1]} (However, Child (2008) argues that it is due to Isaac Barrow.) Here is Leibniz's argument: Let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is
 ${\begin{aligned}d(u\cdot v)&{}=(u+du)\cdot (v+dv)-u\cdot v\\&{}=u\cdot dv+v\cdot du+du\cdot dv.\end{aligned}}$
Since the term du·dv is "negligible" (compared to du and dv), Leibniz concluded that
 $d(u\cdot v)=v\cdot du+u\cdot dv$
and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain
 ${\frac {d}{dx}}(u\cdot v)=v\cdot {\frac {du}{dx}}+u\cdot {\frac {dv}{dx}}$
which can also be written in Lagrange's notation as
 $(u\cdot v)'=v\cdot u'+u\cdot v'.$
Examples
 Suppose we want to differentiate f(x) = x^{2} sin(x). By using the product rule, one gets the derivative f′(x) = 2x sin(x) + x^{2} cos(x) (since the derivative of x^{2} is 2x and the derivative of the sine function is the cosine function).
 One special case of the product rule is the constant multiple rule, which states: if c is a number and f(x) is a differentiable function, then cf(x) is also differentiable, and its derivative is (cf)′(x) = cf′(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear.
 The rule for integration by parts is derived from the product rule, as is (a weak version of) the quotient rule. (It is a "weak" version in that it does not prove that the quotient is differentiable, but only says what its derivative is if it is differentiable.)
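The first example above, f(x) = x² sin(x), can be checked numerically. Here is a minimal sketch in plain Python that compares the product-rule derivative with a central finite-difference approximation (the helper names are illustrative, not from the article):

```python
import math

def f(x):
    # the example function f(x) = x^2 * sin(x)
    return x**2 * math.sin(x)

def f_prime(x):
    # product rule: (x^2)' * sin(x) + x^2 * (sin x)'
    return 2 * x * math.sin(x) + x**2 * math.cos(x)

def central_diff(g, x, h=1e-6):
    # symmetric finite-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

# the two values should agree up to the finite-difference error
assert abs(f_prime(1.3) - central_diff(f, 1.3)) < 1e-5
```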
Proofs
Proof by factoring (Proof from first principles)
Let h(x) = f(x)g(x) and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative, h′(x), is given by f′(x)g(x) + f(x)g′(x). To do this, $-f(x)g(x+\Delta x)+f(x)g(x+\Delta x)$ (which is zero, and thus does not change the value) is added to the numerator to permit its factoring, and then properties of limits are used.
 ${\begin{aligned}h'(x)&=\lim _{\Delta x\to 0}{\frac {h(x+\Delta x)-h(x)}{\Delta x}}\\&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x+\Delta x)+f(x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\&=\lim _{\Delta x\to 0}{\frac {[f(x+\Delta x)-f(x)]\cdot g(x+\Delta x)+f(x)\cdot [g(x+\Delta x)-g(x)]}{\Delta x}}\\&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)-f(x)}{\Delta x}}\cdot \lim _{\Delta x\to 0}g(x+\Delta x)+\lim _{\Delta x\to 0}f(x)\cdot \lim _{\Delta x\to 0}{\frac {g(x+\Delta x)-g(x)}{\Delta x}}\\&=f'(x)g(x)+f(x)g'(x).\end{aligned}}$
Brief proof
By definition, if $f,g:\mathbb {R} \rightarrow \mathbb {R}$ are differentiable at $x$ then we can write
 $f(x+h)=f(x)+f'(x)h+\psi _{1}(h)\qquad \qquad g(x+h)=g(x)+g'(x)h+\psi _{2}(h)$
such that $\lim _{h\to 0}{\frac {\psi _{1}(h)}{h}}=\lim _{h\to 0}{\frac {\psi _{2}(h)}{h}}=0$, also written $\psi _{1},\psi _{2}\sim o(h)$. Then:
 ${\begin{aligned}fg(x+h)-fg(x)&=(f(x)+f'(x)h+\psi _{1}(h))(g(x)+g'(x)h+\psi _{2}(h))-fg(x)\\&=f'(x)g(x)h+f(x)g'(x)h+o(h)\end{aligned}}$
Dividing by $h$ and taking the limit as $h\to 0$ gives the result.
Quarter squares
There is a proof using quarter square multiplication which relies on the chain rule and on the properties of the quarter square function (shown here as q, i.e., with $q(x)={\tfrac {x^{2}}{4}}$):
 $f=q(u+v)-q(u-v),$
Differentiating both sides:
 ${\begin{aligned}f'&=q'(u+v)(u'+v')-q'(u-v)(u'-v')\\&=\left({1 \over 2}(u+v)(u'+v')\right)-\left({1 \over 2}(u-v)(u'-v')\right)\\&={1 \over 2}(uu'+vu'+uv'+vv')-{1 \over 2}(uu'-vu'-uv'+vv')\\&=vu'+uv'\\&=uv'+u'v\end{aligned}}$
Chain rule
The product rule can be considered a special case of the chain rule for several variables.
 ${d(ab) \over dx}={\frac {\partial (ab)}{\partial a}}{\frac {da}{dx}}+{\frac {\partial (ab)}{\partial b}}{\frac {db}{dx}}=b{\frac {da}{dx}}+a{\frac {db}{dx}}.$
Nonstandard analysis
Let u and v be continuous functions in x, and let dx, du and dv be infinitesimals within the framework of nonstandard analysis, specifically the hyperreal numbers. Using st to denote the standard part function that associates to a finite hyperreal number the real infinitely close to it, this gives
 ${\begin{aligned}{\frac {d(uv)}{dx}}&=\operatorname {st} \left({\frac {(u+du)(v+dv)-uv}{dx}}\right)\\&=\operatorname {st} \left({\frac {uv+u\cdot dv+v\cdot du+dv\cdot du-uv}{dx}}\right)\\&=\operatorname {st} \left({\frac {u\cdot dv+(v+dv)\cdot du}{dx}}\right)\\&=u{\frac {dv}{dx}}+v{\frac {du}{dx}}.\end{aligned}}$
This was essentially Leibniz's proof exploiting the transcendental law of homogeneity (in place of the standard part above).
Smooth infinitesimal analysis
In the context of Lawvere's approach to infinitesimals, let dx be a nilsquare infinitesimal. Then du = u' dx and dv = v' dx, so that
 ${\begin{aligned}d(uv)&{}=(u+du)(v+dv)-uv\\&{}=uv+u\cdot dv+v\cdot du+du\cdot dv-uv\\&{}=u\cdot dv+v\cdot du+du\cdot dv\\&{}=u\cdot dv+v\cdot du\end{aligned}}$
since
 $du\cdot dv=u'v'(dx)^{2}=0$
Generalizations
A product of more than two factors
The product rule can be generalized to products of more than two factors. For example, for three factors we have
 ${\frac {d(uvw)}{dx}}={\frac {du}{dx}}vw+u{\frac {dv}{dx}}w+uv{\frac {dw}{dx}}$.
For a collection of functions $f_{1},\dots ,f_{k}$, we have
 ${\frac {d}{dx}}\left[\prod _{i=1}^{k}f_{i}(x)\right]=\sum _{i=1}^{k}\left({\frac {d}{dx}}f_{i}(x)\prod _{j\neq i}f_{j}(x)\right)=\left(\prod _{i=1}^{k}f_{i}(x)\right)\left(\sum _{i=1}^{k}{\frac {f'_{i}(x)}{f_{i}(x)}}\right).$
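The k-factor formula can be sketched directly in code. The following plain-Python check (function names are illustrative) computes the sum over i of f_i′(x) times the product of the remaining factors, and compares it with a finite-difference derivative of the full product:

```python
import math

def product_rule_derivative(fs, dfs, x):
    # generalized product rule: sum over i of f_i'(x) * product of the other f_j(x)
    total = 0.0
    for i in range(len(fs)):
        term = dfs[i](x)
        for j in range(len(fs)):
            if j != i:
                term *= fs[j](x)
        total += term
    return total

# three illustrative factors: u = x, v = sin x, w = e^x
fs  = [lambda x: x, math.sin, math.exp]
dfs = [lambda x: 1.0, math.cos, math.exp]

def product(x):
    return x * math.sin(x) * math.exp(x)

x, h = 0.7, 1e-6
numeric = (product(x + h) - product(x - h)) / (2 * h)
assert abs(product_rule_derivative(fs, dfs, x) - numeric) < 1e-5
```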
Higher derivatives
It can also be generalized to the general Leibniz rule for the nth derivative of a product of two factors, by symbolically expanding according to the binomial theorem:
 $d^{n}(uv)=\sum _{k=0}^{n}{n \choose k}\cdot d^{(n-k)}(u)\cdot d^{(k)}(v).$
Applied at a specific point x, the above formula gives:
 $(uv)^{(n)}(x)=\sum _{k=0}^{n}{n \choose k}\cdot u^{(n-k)}(x)\cdot v^{(k)}(x).$
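For polynomials the general Leibniz rule can be verified exactly. A small sketch (helper names are illustrative) represents polynomials as coefficient lists, takes u = x³ and v = (1+x)², and checks that the nth derivative of uv equals the binomial sum of products of derivatives:

```python
from math import comb

# polynomials as coefficient lists, lowest degree first
def poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_diff(p, n=1):
    # differentiate n times: d/dx sum c_k x^k = sum k*c_k x^(k-1)
    for _ in range(n):
        p = [i * c for i, c in enumerate(p)][1:] or [0]
    return p

def poly_eval(p, x):
    return sum(c * x**k for k, c in enumerate(p))

u = [0, 0, 0, 1]   # x^3
v = [1, 2, 1]      # (1 + x)^2
n, x = 2, 1.5

# left side: (uv)^(n) evaluated at x
lhs = poly_eval(poly_diff(poly_mul(u, v), n), x)
# right side: sum_k C(n,k) * u^(n-k)(x) * v^(k)(x)
rhs = sum(comb(n, k) * poly_eval(poly_diff(u, n - k), x) * poly_eval(poly_diff(v, k), x)
          for k in range(n + 1))
assert abs(lhs - rhs) < 1e-9
```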
Furthermore, for the nth derivative of an arbitrary number of factors:
 $\left(\prod _{i=1}^{k}f_{i}\right)^{(n)}=\sum _{j_{1}+j_{2}+\cdots +j_{k}=n}{n \choose j_{1},j_{2},\ldots ,j_{k}}\prod _{i=1}^{k}f_{i}^{(j_{i})}.$
Higher partial derivatives
For partial derivatives, we have
 ${\partial ^{n} \over \partial x_{1}\,\cdots \,\partial x_{n}}(uv)=\sum _{S}{\partial ^{|S|}u \over \prod _{i\in S}\partial x_{i}}\cdot {\partial ^{n-|S|}v \over \prod _{i\not \in S}\partial x_{i}}$
where the index S runs through the whole list of 2^{n} subsets of {1, ..., n}. For example, when n = 3, then
 ${\begin{aligned}&{}\quad {\partial ^{3} \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}(uv)\\\\&{}=u\cdot {\partial ^{3}v \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{1}}\cdot {\partial ^{2}v \over \partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{2}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{3}}+{\partial u \over \partial x_{3}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{2}}\\\\&{}\qquad +{\partial ^{2}u \over \partial x_{1}\,\partial x_{2}}\cdot {\partial v \over \partial x_{3}}+{\partial ^{2}u \over \partial x_{1}\,\partial x_{3}}\cdot {\partial v \over \partial x_{2}}+{\partial ^{2}u \over \partial x_{2}\,\partial x_{3}}\cdot {\partial v \over \partial x_{1}}+{\partial ^{3}u \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}\cdot v.\end{aligned}}$
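The structure of this sum, one term per subset S of {1, ..., n}, can be enumerated programmatically. A small illustrative sketch (not from the article) uses `itertools` to list the 2³ = 8 terms of the n = 3 case as strings:

```python
from itertools import chain, combinations

def dterm(name, vars_):
    # format a mixed partial like "∂^2 u/∂x1∂x3", or just the name for no derivatives
    if not vars_:
        return name
    num = "∂%s" % name if len(vars_) == 1 else "∂^%d %s" % (len(vars_), name)
    return "%s/%s" % (num, "".join("∂x%d" % i for i in vars_))

def terms(n):
    # one term per subset S of {1, ..., n}: u differentiated on S, v on the complement
    idx = list(range(1, n + 1))
    subsets = chain.from_iterable(combinations(idx, r) for r in range(n + 1))
    out = []
    for S in subsets:
        rest = [i for i in idx if i not in S]
        out.append("(%s)(%s)" % (dterm("u", S), dterm("v", rest)))
    return out

for t in terms(3):
    print(t)
```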
Banach space
Suppose X, Y, and Z are Banach spaces (which includes Euclidean space) and B : X × Y → Z is a continuous bilinear operator. Then B is differentiable, and its derivative at the point (x,y) in X × Y is the linear map D_{(x,y)}B : X × Y → Z given by
 $(D_{\left(x,y\right)}\,B)\left(u,v\right)=B\left(u,y\right)+B\left(x,v\right)\qquad \forall (u,v)\in X\times Y.$
Derivations in abstract algebra
In abstract algebra, the product rule is used to define what is called a derivation, not vice versa.
Vector functions
The product rule extends to scalar multiplication, dot products, and cross products of vector functions.
For scalar multiplication:
$(f\cdot {\mathbf {g}})'=f\;'\cdot {\mathbf {g}}+f\cdot {\mathbf {g}}\;'$
For dot products:
$({\mathbf {f}}\cdot {\mathbf {g}})'={\mathbf {f}}\;'\cdot {\mathbf {g}}+{\mathbf {f}}\cdot {\mathbf {g}}\;'$
For cross products:
$({\mathbf {f}}\times {\mathbf {g}})'={\mathbf {f}}\;'\times {\mathbf {g}}+{\mathbf {f}}\times {\mathbf {g}}\;'$
Note: since cross products are not commutative, the order of the factors matters, i.e. $({\mathbf {f}}\times {\mathbf {g}})'\neq {\mathbf {f}}\;'\times {\mathbf {g}}+{\mathbf {g}}\;'\times {\mathbf {f}}$. Because the cross product is anticommutative, the rule can instead be written as $({\mathbf {f}}\times {\mathbf {g}})'={\mathbf {f}}\;'\times {\mathbf {g}}-{\mathbf {g}}\;'\times {\mathbf {f}}.$
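Both the dot- and cross-product rules can be checked numerically componentwise. A plain-Python sketch (the particular vector functions are illustrative choices) compares each rule against a central finite difference:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

# illustrative vector functions of t and their componentwise derivatives
def f(t):  return [t, t**2, math.sin(t)]
def fp(t): return [1.0, 2 * t, math.cos(t)]
def g(t):  return [math.cos(t), t, 1.0]
def gp(t): return [-math.sin(t), 1.0, 0.0]

t, h = 0.4, 1e-6

# dot product rule: (f·g)' = f'·g + f·g'
numeric = (dot(f(t + h), g(t + h)) - dot(f(t - h), g(t - h))) / (2 * h)
assert abs(numeric - (dot(fp(t), g(t)) + dot(f(t), gp(t)))) < 1e-5

# cross product rule: (f×g)' = f'×g + f×g'  (order of the factors matters)
numeric_c = [(a - b) / (2 * h)
             for a, b in zip(cross(f(t + h), g(t + h)), cross(f(t - h), g(t - h)))]
rule = [a + b for a, b in zip(cross(fp(t), g(t)), cross(f(t), gp(t)))]
assert all(abs(a - b) < 1e-5 for a, b in zip(numeric_c, rule))
```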
Scalar fields
For scalar fields the concept of gradient is the analog of the derivative:
$\nabla (f\cdot g)=\nabla f\cdot g+f\cdot \nabla g$
Applications
Among the applications of the product rule is a proof that
 ${d \over dx}x^{n}=nx^{n-1}$
when n is a positive integer (this rule is true even if n is not positive or is not an integer, but the proof of that must rely on other methods). The proof is by mathematical induction on the exponent n. If n = 0 then x^{n} is constant and nx^{n − 1} = 0. The rule holds in that case because the derivative of a constant function is 0. If the rule holds for any particular exponent n, then for the next value, n + 1, we have
 ${\begin{aligned}{d \over dx}x^{n+1}&{}={d \over dx}\left(x^{n}\cdot x\right)\\[12pt]&{}=x{d \over dx}x^{n}+x^{n}{d \over dx}x\qquad {\mbox{(the product rule is used here)}}\\[12pt]&{}=x\left(nx^{n-1}\right)+x^{n}\cdot 1\qquad {\mbox{(the induction hypothesis is used here)}}\\[12pt]&{}=(n+1)x^{n}.\end{aligned}}$
Therefore if the proposition is true for n, it is true also for n + 1, and therefore for all natural n.
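The power rule proved above can be sanity-checked numerically for small exponents; a quick plain-Python sketch using a central finite difference:

```python
def central_diff(g, x, h=1e-6):
    # symmetric finite-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

# check d/dx x^n = n * x^(n-1) for the first few positive integers
x = 1.7
for n in range(1, 6):
    assert abs(central_diff(lambda t: t**n, x) - n * x**(n - 1)) < 1e-4
```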
References
 ^ Michelle Cirillo (August 2007). "Humanizing Calculus". The Mathematics Teacher. 101 (1): 23–27.
 Child, J. M. (2008) "The early mathematical manuscripts of Leibniz", Gottfried Wilhelm Leibniz, translated by J. M. Child; page 29, footnote 58.