Sum rule in differentiation

In calculus, the sum rule in differentiation is a method of finding the derivative of a function that is the sum of two other functions for which derivatives exist. This is a part of the linearity of differentiation. The sum rule in integration follows from it. The rule itself is a direct consequence of differentiation from first principles.

The sum rule tells us that for two functions u and v:

\frac{d}{dx}(u + v)=\frac{du}{dx}+\frac{dv}{dx}

This rule also applies to subtraction, and to additions and subtractions of more than two functions:

\frac{d}{dx}(u + v + w + \dots)=\frac{du}{dx}+\frac{dv}{dx}+\frac{dw}{dx}+\cdots
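
For example, taking u = x^2 and v = \sin x (chosen purely for illustration) and using the standard derivatives of these two functions:

\frac{d}{dx}\left(x^2 + \sin x\right) = \frac{d}{dx}x^2 + \frac{d}{dx}\sin x = 2x + \cos x.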

Proof

Simple Proof

Let h(x) = f(x) + g(x), and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative h'(x) is given by f'(x)+g'(x).

h'(x) = \lim_{a\to 0} \frac{h(x+a)-h(x)}{a}

= \lim_{a\to 0} \frac{f(x+a)+g(x+a)-(f(x)+g(x))}{a}
= \lim_{a\to 0} \frac{f(x+a)-f(x)+g(x+a)-g(x)}{a}
= \lim_{a\to 0} \frac{f(x+a)-f(x)}{a} + \lim_{a\to 0} \frac{g(x+a)-g(x)}{a}
= f'(x)+g'(x).

The limit of the sum could be split into a sum of limits because both of those limits exist: f and g are differentiable at x by assumption.
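
As a concrete instance of this first-principles computation, take (purely for illustration) f(x) = x^2 and g(x) = x, so that h(x) = x^2 + x:

h'(x) = \lim_{a\to 0} \frac{(x+a)^2+(x+a)-x^2-x}{a} = \lim_{a\to 0} \frac{2xa+a^2+a}{a} = \lim_{a\to 0}\left(2x+a+1\right) = 2x+1 = f'(x)+g'(x).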

More Complicated Proof

Let y be a function given by the sum of two functions u and v, such that:

y = u + v \,

Now let x be increased by a small increment Δx, and let Δy, Δu and Δv be the corresponding increments in y, u and v respectively. Hence:

y + \Delta{y} = (u + \Delta{u}) + (v + \Delta{v}) = u + v + \Delta{u} + \Delta{v} = y + \Delta{u} + \Delta{v}. \,

So:

\Delta{y} = \Delta{u} + \Delta{v}. \,

Now divide throughout by Δx:

\frac{\Delta{y}}{\Delta{x}} = \frac{\Delta{u}}{\Delta{x}} + \frac{\Delta{v}}{\Delta{x}}.

Let Δx tend to 0:

\frac{dy}{dx} = \frac{du}{dx} + \frac{dv}{dx}.

Now recall that y = u + v, giving the sum rule in differentiation:

\frac{d}{dx}\left(u + v\right) = \frac{du}{dx} + \frac{dv}{dx} .

The rule can be extended to subtraction, as follows:

\frac{d}{dx}\left(u - v\right) = \frac{d}{dx}\left(u + (-v)\right) = \frac{du}{dx} + \frac{d}{dx}\left(-v\right).

Now use the special case of the constant factor rule in differentiation with k=−1 to obtain:

\frac{d}{dx}\left(u - v\right) = \frac{du}{dx} + \left(-\frac{dv}{dx}\right) = \frac{du}{dx} - \frac{dv}{dx}.

Therefore, the sum rule can be extended to cover both addition and subtraction, as follows:

\frac{d}{dx}\left(u \pm v\right) = \frac{du}{dx} \pm \frac{dv}{dx}.
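
For instance, taking u = x^3 and v = \cos x (again chosen purely for illustration):

\frac{d}{dx}\left(x^3 - \cos x\right) = \frac{d}{dx}x^3 - \frac{d}{dx}\cos x = 3x^2 + \sin x.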

The sum rule in differentiation can be used as part of the derivation of both the sum rule in integration and the linearity of differentiation.

Generalization to finite sums

Consider a set of functions f_1, f_2, ..., f_n. Then

\frac{d}{dx} \left(\sum_{1 \le i \le n} f_i(x)\right) = \frac{d}{dx}\left(f_1(x) + f_2(x) + \cdots + f_n(x)\right) = \frac{d}{dx}f_1(x) + \frac{d}{dx}f_2(x) + \cdots + \frac{d}{dx}f_n(x)

so

\frac{d}{dx} \left(\sum_{1 \le i \le n} f_i(x)\right) = \sum_{1 \le i \le n} \left(\frac{d}{dx}f_i(x)\right) .

In other words, the derivative of any finite sum of functions is the sum of the derivatives of those functions.

This follows easily by induction; we have just proven it to be true for n = 2. Assume it is true for all n < k, and define

g(x)=\sum_{i=1}^{k-1} f_i(x).

Then

\sum_{i=1}^k f_i(x)=g(x)+f_k(x)

and it follows from the proof above that

\frac{d}{dx} \left(\sum_{i=1}^k f_i(x)\right) = \frac{d}{dx}g(x)+\frac{d}{dx}f_k(x).

By the inductive hypothesis,

\frac{d}{dx}g(x)=\frac{d}{dx} \left(\sum_{i=1}^{k-1} f_i(x)\right)=\sum_{i=1}^{k-1} \frac{d}{dx}f_i(x)

so

\frac{d}{dx} \left(\sum_{i=1}^k f_i(x) \right) = \sum_{i=1}^{k-1} \frac{d}{dx}f_i(x) + \frac{d}{dx}f_k(x)=\sum_{i=1}^k \frac{d}{dx}f_i(x)

which ends the proof of the sum rule of differentiation.
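
This finite-sum rule is, for example, what justifies differentiating a polynomial term by term (the individual terms below are handled with the power rule and the constant factor rule; the coefficients a_0, ..., a_n are arbitrary constants introduced here only for illustration):

\frac{d}{dx}\left(a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0\right) = n a_n x^{n-1} + (n-1) a_{n-1} x^{n-2} + \cdots + a_1.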

Note this does not automatically extend to infinite sums. An intuitive reason for why things can go wrong is that there is more than one limit involved (specifically, one for the sum and one in the definition of the derivative). Uniform convergence deals with these sorts of issues.
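
A classical example of what can go wrong: the series below converges uniformly on the whole real line (by the Weierstrass M-test, since |\sin(nx)/n^2| \le 1/n^2) and every term is differentiable, yet the series obtained by differentiating term by term diverges at x = 0, where it reduces to the harmonic series:

\sum_{n=1}^{\infty} \frac{\sin(nx)}{n^2}, \qquad \sum_{n=1}^{\infty} \frac{d}{dx}\,\frac{\sin(nx)}{n^2} = \sum_{n=1}^{\infty} \frac{\cos(nx)}{n}.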
