Chapter 5 Continuity And Differentiability

CONTINUITY AND DIFFERENTIABILITY

5.1 Introduction

This chapter is essentially a continuation of our study of differentiation of functions in Class XI. We had learnt to differentiate certain functions like polynomial functions and trigonometric functions. In this chapter, we introduce the very important concepts of continuity, differentiability and relations between them. We will also learn differentiation of inverse trigonometric functions. Further, we introduce a new class of functions called exponential and logarithmic functions. These functions lead to powerful techniques of differentiation. We illustrate certain geometrically obvious conditions through differential calculus. In the process, we will learn some fundamental theorems in this area.

5.2 Continuity


We start the section with two informal examples to get a feel of continuity. Consider the function

$ f(x)=\begin{cases} 1, & \text{ if } x \leq 0 \\ 2, & \text{ if } x>0 \end{cases} $

This function is of course defined at every point of the real line. The graph of this function is given in Fig 5.1. One can deduce from the graph that the values of the function at nearby points on the $x$-axis remain close to each other except at $x=0$. At points near and to the left of 0, i.e., at points like $-0.1,-0.01,-0.001$, the value of the function is 1. At points near and to the right of 0, i.e., at points like $0.1, 0.01, 0.001$, the value of the function is 2.

Fig 5.1

Using the language of left and right hand limits, we may say that the left (respectively right) hand limit of $f$ at 0 is 1 (respectively 2). In particular, the left and right hand limits do not coincide. We also observe that the value of the function at $x=0$ coincides with the left hand limit. Note that we cannot draw the graph in one stroke, i.e., without lifting the pen from the plane of the paper; in fact, we need to lift the pen when we come to 0 from the left. This is one instance of a function being not continuous at $x=0$.

Now, consider the function defined as

$ f(x)=\begin{cases} 1, & \text{ if } x \neq 0 \\ 2, & \text{ if } x=0 \end{cases} $

This function is also defined at every point. The left and right hand limits at $x=0$ are both equal to 1. But the value of the function at $x=0$ equals 2, which does not coincide with the common value of the left and right hand limits. Again, we note that we cannot draw the graph of the function without lifting the pen. This is yet another instance of a function being not continuous at $x=0$.

Naively, we may say that a function is continuous at a fixed point if we can draw the graph of the function around that point without lifting the pen from the plane of the paper.

Fig 5.2

Mathematically, it may be phrased precisely as follows:

Definition 1 Suppose $f$ is a real function on a subset of the real numbers and let $c$ be a point in the domain of $f$. Then $f$ is continuous at $c$ if

$ \lim _{x \to c} f(x)=f(c) $

More elaborately, if the left hand limit, right hand limit and the value of the function at $x=c$ exist and are equal to each other, then $f$ is said to be continuous at $x=c$. Recall that if the right hand and left hand limits at $x=c$ coincide, then we say that the common value is the limit of the function at $x=c$. Hence we may also rephrase the definition of continuity as follows: a function is continuous at $x=c$ if the function is defined at $x=c$ and if the value of the function at $x=c$ equals the limit of the function at $x=c$. If $f$ is not continuous at $c$, we say $f$ is discontinuous at $c$ and $c$ is called a point of discontinuity of $f$.
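
For readers who wish to experiment, here is a small numerical sketch (in Python) of Definition 1 applied to the step function of Fig 5.1: we sample the function just to the left and just to the right of $c=0$ and compare with $f(c)$. The step size $h$ chosen below is an arbitrary illustrative value, not part of the definition.

```python
# Numerical sketch of Definition 1 for the step function of Fig 5.1.
# The step size h is an arbitrary illustrative choice; sampling at c - h and
# c + h only approximates the left and right hand limits, it does not prove them.

def f(x):
    return 1 if x <= 0 else 2

def one_sided_values(f, c, h=1e-6):
    """Values of f just to the left and just to the right of c."""
    return f(c - h), f(c + h)

c = 0
left, right = one_sided_values(f, c)
print(left, right, f(c))   # 1 2 1 -> the two one-sided limits disagree,
                           # so f is not continuous at x = 0
```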

Definition 2 A real function $f$ is said to be continuous if it is continuous at every point in the domain of $f$.

This definition requires a bit of elaboration. Suppose $f$ is a function defined on a closed interval $[a, b]$, then for $f$ to be continuous, it needs to be continuous at every point in $[a, b]$ including the end points $a$ and $b$. Continuity of $f$ at $a$ means

$ \lim _{x \to a^{+}} f(x)=f(a) $

and continuity of $f$ at $b$ means

$ \lim _{x \to b^{-}} f(x)=f(b) $

Observe that $\lim _{x \to a^{-}} f(x)$ and $\lim _{x \to b^{+}} f(x)$ do not make sense. As a consequence of this definition, if $f$ is defined only at one point, it is continuous there, i.e., if the domain of $f$ is a singleton, $f$ is a continuous function.

5.2.1 Algebra of continuous functions

In the previous class, after having understood the concept of limits, we learnt some algebra of limits. Analogously, now we will study some algebra of continuous functions. Since continuity of a function at a point is entirely dictated by the limit of the function at that point, it is reasonable to expect results analogous to the case of limits.

Theorem 1 Suppose $f$ and $g$ are two real functions continuous at a real number $c$. Then

(1) $f+g$ is continuous at $x=c$.

(2) $f-g$ is continuous at $x=c$.

(3) $f . g$ is continuous at $x=c$.

(4) $(\frac{f}{g})$ is continuous at $x=c$, (provided $g(c) \neq 0$).

Proof We are investigating continuity of $(f+g)$ at $x=c$. Clearly it is defined at $x=c$. We have

$ \begin{aligned} \lim _{x \to c}(f+g)(x) & =\lim _{x \to c}[f(x)+g(x)] & & \text{ (by definition of f+g)} \\ & =\lim _{x \to c} f(x)+\lim _{x \to c} g(x) & & \text{ (by the theorem on limits) } \\ & =f(c)+g(c) & & (\text{ as } f \text{ and } g \text{ are continuous }) \\ & =(f+g)(c) & & (\text{ by definition of } f+g) \end{aligned} $

Hence, $f+g$ is continuous at $x=c$.

Proofs for the remaining parts are similar and left as an exercise to the reader.

Remarks

(i) As a special case of (3) above, if $f$ is a constant function, i.e., $f(x)=\lambda$ for some real number $\lambda$, then the function $(\lambda . g)$ defined by $(\lambda . g)(x)=\lambda . g(x)$ is also continuous. In particular if $\lambda=-1$, the continuity of $f$ implies continuity of $-f$.

(ii) As a special case of (4) above, if $f$ is the constant function $f(x)=\lambda$, then the function $\frac{\lambda}{g}$ defined by $\frac{\lambda}{g}(x)=\frac{\lambda}{g(x)}$ is also continuous wherever $g(x) \neq 0$. In particular, the continuity of $g$ implies continuity of $\frac{1}{g}$.

The above theorem can be exploited to generate many continuous functions. They also aid in deciding if certain functions are continuous or not. The following examples illustrate this:
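
As a first illustration (a symbolic spot check rather than a proof), the short Python/sympy sketch below takes $f(x)=x^{2}$ and $g(x)=\sin x$, both continuous at $c=1$, and verifies that the limit of each combination in Theorem 1 at $c$ equals its value there; the particular functions and the point $c$ are our own choices.

```python
# Spot check of Theorem 1 with sympy: limits of f+g, f-g, f.g and f/g at c
# coincide with their values at c. The functions and the point c = 1 are
# illustrative choices (note sin(1) != 0, so the quotient is allowed).
import sympy as sp

x = sp.symbols('x')
f, g = x**2, sp.sin(x)
c = 1

for expr in (f + g, f - g, f * g, f / g):
    limit_at_c = sp.limit(expr, x, c)
    value_at_c = expr.subs(x, c)
    assert sp.simplify(limit_at_c - value_at_c) == 0
    print(expr, "is continuous at x =", c)
```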

5.3 Differentiability

Recall the following facts from previous class. We had defined the derivative of a real function as follows:

Suppose $f$ is a real function and $c$ is a point in its domain. The derivative of $f$ at $c$ is defined by

$ \lim _{h \to 0} \frac{f(c+h)-f(c)}{h} $

provided this limit exists. The derivative of $f$ at $c$ is denoted by $f^{\prime}(c)$ or $\left.\frac{d}{d x}(f(x))\right|_{c}$. The function defined by

$ f^{\prime}(x)=\lim _{h \to 0} \frac{f(x+h)-f(x)}{h} $

wherever the limit exists is defined to be the derivative of $f$. The derivative of $f$ is denoted by $f^{\prime}(x)$ or $\frac{d}{d x}(f(x))$ or, if $y=f(x)$, by $\frac{d y}{d x}$ or $y^{\prime}$. The process of finding the derivative of a function is called differentiation. We also use the phrase differentiate $f(x)$ with respect to $x$ to mean find $f^{\prime}(x)$.
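
The definition above can be explored numerically: the following Python sketch approximates $f^{\prime}(x)$ by the difference quotient for a small, arbitrarily chosen step $h$, and compares the result for $f=\sin$ with the known derivative $\cos$. It is only an approximation, not the limit itself.

```python
# Difference-quotient approximation of f'(x); h and the test point x0 are
# arbitrary illustrative choices, and the result only approximates the limit.
import math

def derivative(f, x, h=1e-6):
    """Approximate f'(x) by (f(x + h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

x0 = 0.7
print(derivative(math.sin, x0))   # approximately cos(0.7)
print(math.cos(x0))               # 0.7648...
```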

The following rules were established as a part of algebra of derivatives:

(1) $(u \pm v)^{\prime}=u^{\prime} \pm v^{\prime}$

(2) $(u v)^{\prime}=u^{\prime} v+u v^{\prime}$ (Leibnitz or product rule)

(3) $(\frac{u}{v})^{\prime}=\frac{u^{\prime} v-u v^{\prime}}{v^{2}}$, wherever $v \neq 0$ (Quotient rule).

The following table gives a list of derivatives of certain standard functions:

Table 5.3

| $f(x)$ | $x^{n}$ | $\sin x$ | $\cos x$ | $\tan x$ |
|---|---|---|---|---|
| $f^{\prime}(x)$ | $n x^{n-1}$ | $\cos x$ | $-\sin x$ | $\sec^{2} x$ |

Whenever we defined the derivative, we had put a caution provided the limit exists. Now the natural question is: what if it doesn’t? The question is quite pertinent and so is its answer. If $\lim _{h \to 0} \frac{f(c+h)-f(c)}{h}$ does not exist, we say that $f$ is not differentiable at $c$. In other words, we say that a function $f$ is differentiable at a point $c$ in its domain if both $\lim _{h \to 0^{-}} \frac{f(c+h)-f(c)}{h}$ and $\lim _{h \to 0^{+}} \frac{f(c+h)-f(c)}{h}$ are finite and equal. A function is said to be differentiable in an interval $[a, b]$ if it is differentiable at every point of $[a, b]$. As in the case of continuity, at the end points $a$ and $b$, we take the right hand limit and left hand limit, which are nothing but the right hand derivative and the left hand derivative of the function at $a$ and $b$ respectively. Similarly, a function is said to be differentiable in an interval $(a, b)$ if it is differentiable at every point of $(a, b)$.

Theorem 3 If a function $f$ is differentiable at a point $c$, then it is also continuous at that point.

Proof Since $f$ is differentiable at $c$, we have

$ \begin{aligned} \lim _{x \to c} \frac{f(x)-f(c)}{x-c}=f^{\prime}(c) \end{aligned} $

But for $x \neq c$, we have

$ f(x)-f(c)=\frac{f(x)-f(c)}{x-c} .(x-c) $

Therefore

$ \begin{aligned} \lim _{x \to c}[f(x)-f(c)] & =\lim _{x \to c}[\frac{f(x)-f(c)}{x-c} \cdot(x-c)] \\ & =\lim _{x \to c}[\frac{f(x)-f(c)}{x-c}] \cdot \lim _{x \to c}(x-c) \\ & =f^{\prime}(c) \cdot 0=0 \end{aligned} $

or

$ \lim _{x \to c} f(x)=f(c) $

Hence $f$ is continuous at $x=c$.

Corollary 1 Every differentiable function is continuous.

We remark that the converse of the above statement is not true. Indeed we have seen that the function defined by $f(x)=|x|$ is a continuous function. Consider the left hand limit

$ \lim _{h \to 0^{-}} \frac{f(0+h)-f(0)}{h}=\lim _{h \to 0^{-}} \frac{|h|-0}{h}=\lim _{h \to 0^{-}} \frac{-h}{h}=-1 $

The right hand limit

$ \lim _{h \to 0^{+}} \frac{f(0+h)-f(0)}{h}=\lim _{h \to 0^{+}} \frac{|h|-0}{h}=\lim _{h \to 0^{+}} \frac{h}{h}=1 $

Since the above left and right hand limits at 0 are not equal, $\lim _{h \to 0} \frac{f(0+h)-f(0)}{h}$ does not exist and hence $f$ is not differentiable at 0 . Thus $f$ is not a differentiable function.
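
The computation above can be mirrored numerically; in the sketch below (with an arbitrarily chosen small $h$) the left hand difference quotient of $|x|$ at 0 works out to $-1$ and the right hand one to $+1$, in agreement with the limits just computed.

```python
# One-sided difference quotients of f(x) = |x| at 0; h is an illustrative choice.

f = abs
h = 1e-8
left  = (f(0 - h) - f(0)) / (-h)   # left hand quotient  -> -1.0
right = (f(0 + h) - f(0)) / (+h)   # right hand quotient -> +1.0
print(left, right)                 # the two disagree, so f'(0) does not exist
```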

5.3.1 Derivatives of composite functions

To study derivative of composite functions, we start with an illustrative example. Say, we want to find the derivative of $f$, where

$ f(x)=(2 x+1)^{3} $

One way is to expand $(2 x+1)^{3}$ using binomial theorem and find the derivative as a polynomial function as illustrated below.

$ \begin{aligned} \frac{d}{d x} f(x) & =\frac{d}{d x}[(2 x+1)^{3}] \\ & =\frac{d}{d x}(8 x^{3}+12 x^{2}+6 x+1) \\ & =24 x^{2}+24 x+6 \\ & =6(2 x+1)^{2} \end{aligned} $

Now, observe that

$ f(x)=(h \circ g)(x) $

where $g(x)=2 x+1$ and $h(x)=x^{3}$. Put $t=g(x)=2 x+1$. Then $f(x)=h(t)=t^{3}$. Thus

$ \frac{d f}{d x}=6(2 x+1)^{2}=3(2 x+1)^{2} \cdot 2=3 t^{2} \cdot 2=\frac{d h}{d t} \cdot \frac{d t}{d x} $

The advantage with such observation is that it simplifies the calculation in finding the derivative of, say, $(2 x+1)^{100}$. We may formalise this observation in the following theorem called the chain rule.

Theorem 4 (Chain Rule) Let $f$ be a real valued function which is a composite of two functions $u$ and $v$; i.e., $f=v \circ u$. Suppose $t=u(x)$ and if both $\frac{d t}{d x}$ and $\frac{d v}{d t}$ exist, we have

$ \frac{d f}{d x}=\frac{d v}{d t} \cdot \frac{d t}{d x} $

We skip the proof of this theorem. Chain rule may be extended as follows. Suppose $f$ is a real valued function which is a composite of three functions $u, v$ and $w$; i.e.,

$f=(w \circ u) \circ v$. If $t=v(x)$ and $s=u(t)$, then

$ \frac{d f}{d x}=\frac{d(w \circ u)}{d t} \cdot \frac{d t}{d x}=\frac{d w}{d s} \cdot \frac{d s}{d t} \cdot \frac{d t}{d x} $

provided all the derivatives in the statement exist. The reader is invited to formulate the chain rule for a composite of more functions.
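
For instance, one may check the chain rule symbolically for the function $(2x+1)^{100}$ mentioned above. The sympy sketch below is only a verification of this one example, with $h(t)=t^{100}$ and $t=g(x)=2x+1$ as in the discussion.

```python
# Symbolic check of the chain rule for f(x) = (2x + 1)**100 using sympy,
# with outer function h(t) = t**100 and inner function t = g(x) = 2x + 1.
import sympy as sp

x, t = sp.symbols('x t')
g = 2 * x + 1
h = t ** 100

direct     = sp.diff(g ** 100, x)                       # differentiate directly
chain_rule = sp.diff(h, t).subs(t, g) * sp.diff(g, x)   # dh/dt . dt/dx
assert sp.simplify(direct - chain_rule) == 0
print(chain_rule)                                       # 200*(2*x + 1)**99
```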

5.3.2 Derivatives of implicit functions

Until now we have been differentiating various functions given in the form $y=f(x)$. But it is not necessary that functions are always expressed in this form. For example, consider one of the following relationships between $x$ and $y$ :

$ \begin{aligned} x-y-\pi & =0 \\ x+\sin x y-y & =0 \end{aligned} $

In the first case, we can solve for $y$ and rewrite the relationship as $y=x-\pi$. In the second case, it does not seem that there is an easy way to solve for $y$. Nevertheless, there is no doubt about the dependence of $y$ on $x$ in either of the cases. When a relationship between $x$ and $y$ is expressed in a way that it is easy to solve for $y$ and write $y=f(x)$, we say that $y$ is given as an explicit function of $x$. In the latter case it is implicit that $y$ is a function of $x$ and we say that the relationship of the second type, above, gives the function implicitly. In this subsection, we learn to differentiate implicit functions.
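
As a preview (a sympy sketch of the idea, with the algebra carried out by the computer), the second relation above, $x+\sin xy-y=0$, can be differentiated by treating $y$ as an unspecified function $y(x)$, differentiating the whole equation with respect to $x$, and solving for $\frac{dy}{dx}$.

```python
# Implicit differentiation of x + sin(x*y) - y = 0 with sympy: treat y as an
# unknown function y(x), differentiate the relation, and solve for dy/dx.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')(x)

relation = x + sp.sin(x * y) - y
dydx = sp.solve(sp.diff(relation, x), sp.diff(y, x))[0]
print(sp.simplify(dydx))
# Mathematically this is dy/dx = (1 + y*cos(x*y)) / (1 - x*cos(x*y)).
```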

5.3.3 Derivatives of inverse trigonometric functions

We remark that inverse trigonometric functions are continuous functions, but we will not prove this. Now we use chain rule to find derivatives of these functions.
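
A quick symbolic confirmation of two of these derivatives (the formulas for $\sin^{-1}x$ and $\tan^{-1}x$ listed later in the Summary) can be obtained with sympy; this is a check of the formulas, not a derivation.

```python
# Symbolic check that d/dx(asin x) = 1/sqrt(1 - x**2) and
# d/dx(atan x) = 1/(1 + x**2), valid on the appropriate domains.
import sympy as sp

x = sp.symbols('x')
assert sp.simplify(sp.diff(sp.asin(x), x) - 1 / sp.sqrt(1 - x**2)) == 0
assert sp.simplify(sp.diff(sp.atan(x), x) - 1 / (1 + x**2)) == 0
print(sp.diff(sp.asin(x), x), sp.diff(sp.atan(x), x))
```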

5.4 Exponential and Logarithmic Functions

Till now we have learnt some aspects of different classes of functions like polynomial functions, rational functions and trigonometric functions. In this section, we shall learn about a new class of (related) functions called exponential functions and logarithmic functions. It needs to be emphasized that many statements made in this section are motivational and precise proofs of these are well beyond the scope of this text.

Fig 5.9 gives a sketch of $y=f_1(x)=x$, $y=f_2(x)=x^{2}$, $y=f_3(x)=x^{3}$ and $y=f_4(x)=x^{4}$. Observe that the curves get steeper as the power of $x$ increases. The steeper the curve, the faster the rate of growth. What this means is that for a fixed increment in the value of $x(>1)$, the increment in the value of $y=f_n(x)$ increases as $n$ increases for $n=1,2,3,4$. It is conceivable that such a statement is true for all positive values of $n$, where $f_n(x)=x^{n}$. Essentially, this means that the graph of $y=f_n(x)$ leans more towards the $y$-axis as $n$ increases. For example, consider $f _{10}(x)=x^{10}$ and $f _{15}(x)=x^{15}$. If $x$ increases from 1 to 2, $f _{10}$ increases from 1 to $2^{10}$ whereas $f _{15}$ increases from 1 to $2^{15}$. Thus, for the same increment in $x$, $f _{15}$ grows faster than $f _{10}$.

The upshot of the above discussion is that the growth of polynomial functions is dependent on the degree of the polynomial function: the higher the degree, the greater the growth. The next natural question is:

Fig 5.9

Is there a function which grows faster than any polynomial function? The answer is in the affirmative, and an example of such a function is

$ y=f(x)=10^{x} . $

Our claim is that this function $f$ grows faster than $f_n(x)=x^{n}$ for any positive integer $n$. For example, we can prove that $10^{x}$ grows faster than $f _{100}(x)=x^{100}$. For large values of $x$ like $x=10^{3}$, note that $f _{100}(x)=(10^{3})^{100}=10^{300}$ whereas $f(10^{3})=10^{10^{3}}=10^{1000}$. Clearly $f(x)$ is much greater than $f _{100}(x)$. It is not difficult to prove that for all $x>10^{3}, f(x)>f _{100}(x)$. But we will not attempt to give a proof of this here. Similarly, by choosing large values of $x$, one can verify that $f(x)$ grows faster than $f_n(x)$ for any positive integer $n$.

Definition 3 The exponential function with positive base $b>1$ is the function

$ y=f(x)=b^{x} $

The graph of $y=10^{x}$ is given in the Fig 5.9.

It is advised that the reader plots this graph for particular values of $b$ like 2,3 and 4. Following are some of the salient features of the exponential functions:

(1) Domain of the exponential function is $\mathbf{R}$, the set of all real numbers.

(2) Range of the exponential function is the set of all positive real numbers.

(3) The point $(0,1)$ is always on the graph of the exponential function (this is a restatement of the fact that $b^{0}=1$ for any real $b>1$ ).

(4) Exponential function is ever increasing; i.e., as we move from left to right, the graph rises above.

(5) For very large negative values of $x$, the exponential function is very close to 0 . In other words, in the second quadrant, the graph approaches $x$-axis (but never meets it).

Exponential function with base 10 is called the common exponential function. In the Appendix A.1.4 of Class XI, it was observed that the sum of the series

$ 1+\frac{1}{1 !}+\frac{1}{2 !}+\ldots $

is a number between 2 and 3 and is denoted by $e$. Using this $e$ as the base we obtain an extremely important exponential function $y=e^{x}$.

This is called natural exponential function.
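
The value of $e$ can be estimated by adding a few terms of this series; the short sketch below (with the number of terms chosen arbitrarily) shows the partial sums settling between 2 and 3, close to the value of $e$ known to Python's math module.

```python
# Partial sums of 1 + 1/1! + 1/2! + ...; the cut-off of 15 terms is arbitrary.
import math

total, term = 1.0, 1.0
for n in range(1, 15):
    term /= n            # term now equals 1/n!
    total += term
print(total)             # approximately 2.718281828...
print(math.e)            # the value of e for comparison
```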

It would be interesting to know if the inverse of the exponential function exists and has nice interpretation. This search motivates the following definition.

Definition 4 Let $b>1$ be a real number. Then we say logarithm of $a$ to base $b$ is $x$ if $b^{x}=a$.

Logarithm of $a$ to base $b$ is denoted by $\log _{b} a$. Thus $\log _{b} a=x$ if $b^{x}=a$. Let us work with a few explicit examples to get a feel for this. We know $2^{3}=8$. In terms of logarithms, we may rewrite this as $\log _2 8=3$. Similarly, $10^{4}=10000$ is equivalent to saying $\log _{10} 10000=4$. Also, $625=5^{4}=25^{2}$ is equivalent to saying $\log _5 625=4$ or $\log _{25} 625=2$.

On a slightly more mature note, fixing a base $b>1$, we may look at logarithm as a function from positive real numbers to all real numbers. This function, called the logarithmic function, is defined by

$ \begin{aligned} \log _{b}: \mathbf{R}^{+} & \to \mathbf{R} \\ x & \to \log _{b} x=y \text{ if } b^{y}=x \end{aligned} $

As before, if the base $b=10$, we say it is the common logarithm and if $b=e$, then we say it is the natural logarithm. Often the natural logarithm is denoted by $\ln$. In this chapter, $\log x$ denotes the logarithm function to base $e$, i.e., $\ln x$ will be written as simply $\log x$. Fig 5.10 gives the plots of the logarithm function to base 2, $e$ and 10.

Some of the important observations about the logarithm function to any base $b>1$ are listed below:

Fig 5.10

(1) We cannot make a meaningful definition of logarithm of non-positive numbers and hence the domain of log function is $\mathbf{R}^{+}$.

(2) The range of log function is the set of all real numbers.

(3) The point $(1,0)$ is always on the graph of the log function.

(4) The log function is ever increasing, i.e., as we move from left to right the graph rises above.

(5) For $x$ very near to zero, the value of $\log x$ can be made less than any given real number. In other words, in the fourth quadrant the graph approaches the $y$-axis (but never meets it).

(6) Fig 5.11 gives the plot of $y=e^{x}$ and $y=\ln x$. It is of interest to observe that the two curves are the mirror images of each other reflected in the line $y=x$.

Fig 5.11

Two properties of '$\log$' functions are proved below:

(1) There is a standard change of base rule to obtain $\log _{a} p$ in terms of $\log _{b} p$. Let $\log _{a} p=\alpha, \log _{b} p=\beta$ and $\log _{b} a=\gamma$. This means $a^{\alpha}=p, b^{\beta}=p$ and $b^{\gamma}=a$.

Substituting the third equation in the first one, we have

$ (b^{\gamma})^{\alpha}=b^{\gamma \alpha}=p $

Using this in the second equation, we get

$ b^{\beta}=p=b^{\gamma \alpha} $

which implies $\beta=\alpha \gamma$ or $\alpha=\frac{\beta}{\gamma}$. But then

$ \log _{a} p=\frac{\log _{b} p}{\log _{b} a} $

(2) Another interesting property of the log function is its effect on products. Let $\log _{b} p q=\alpha$. Then $b^{\alpha}=p q$. If $\log _{b} p=\beta$ and $\log _{b} q=\gamma$, then $b^{\beta}=p$ and $b^{\gamma}=q$. But then $b^{\alpha}=p q=b^{\beta} b^{\gamma}=b^{\beta+\gamma}$

which implies $\alpha=\beta+\gamma$, i.e.,

$ \log _{b} p q=\log _{b} p+\log _{b} q $

A particularly interesting and important consequence of this is when $p=q$. In this case the above may be rewritten as

$ \log _{b} p^{2}=\log _{b} p+\log _{b} p=2 \log _{b} p $

An easy generalisation of this (left as an exercise!) is

$ \log _{b} p^{n}=n \log _{b} p $

for any positive integer $n$. In fact this is true for any real number $n$, but we will not attempt to prove this. On similar lines, the reader is invited to verify

$ \log _{b} \frac{x}{y}=\log _{b} x-\log _{b} y $
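
All four rules (change of base, product, power and quotient) can be spot-checked numerically; the Python sketch below does so for a few arbitrarily chosen positive numbers, using math.log(x, base).

```python
# Numerical spot check of the logarithm rules derived above; the numbers
# p, q and the bases a, b are arbitrary positive choices with a, b > 1.
import math

p, q, a, b = 8.0, 5.0, 3.0, 2.0

assert math.isclose(math.log(p, a), math.log(p, b) / math.log(a, b))      # change of base
assert math.isclose(math.log(p * q, b), math.log(p, b) + math.log(q, b))  # product rule
assert math.isclose(math.log(p ** 3, b), 3 * math.log(p, b))              # power rule
assert math.isclose(math.log(p / q, b), math.log(p, b) - math.log(q, b))  # quotient rule
print("all four logarithm identities hold numerically")
```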

5.5 Logarithmic Differentiation

In this section, we will learn to differentiate a certain special class of functions given in the form

$ y=f(x)=[u(x)]^{v(x)} $

By taking logarithm (to base $e$ ) the above may be rewritten as

$ \log y=v(x) \log [u(x)] $

Using chain rule we may differentiate this to get

$ \frac{1}{y} \cdot \frac{d y}{d x}=v(x) \cdot \frac{1}{u(x)} \cdot u^{\prime}(x)+v^{\prime}(x) \cdot \log [u(x)] $

which implies that

$ \frac{d y}{d x}=y[\frac{v(x)}{u(x)} \cdot u^{\prime}(x)+v^{\prime}(x) \cdot \log [u(x)]] $

The main point to be noted in this method is that $f(x)$ and $u(x)$ must always be positive as otherwise their logarithms are not defined. This process of differentiation is known as logarithmic differentiation and is illustrated by the following examples:
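
As one such example (worked symbolically with sympy rather than by hand), take $u(x)=x$ and $v(x)=x$, so that $y=x^{x}$ with $x>0$; the formula above gives $\frac{d y}{d x}=x^{x}(1+\log x)$, and the sketch below confirms that this agrees with direct differentiation.

```python
# Logarithmic differentiation of y = x**x (u(x) = x, v(x) = x, x > 0):
# the formula dy/dx = y*(v/u * u' + v' * log u) should match sympy's diff.
import sympy as sp

x = sp.symbols('x', positive=True)
u, v = x, x
y = u ** v

by_formula = y * (v / u * sp.diff(u, x) + sp.diff(v, x) * sp.log(u))
assert sp.simplify(sp.diff(y, x) - by_formula) == 0
print(sp.simplify(by_formula))   # x**x * (log(x) + 1)
```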

5.6 Derivatives of Functions in Parametric Forms

Sometimes the relation between two variables is neither explicit nor implicit, but some link of a third variable with each of the two variables, separately, establishes a relation between the first two variables. In such a situation, we say that the relation between them is expressed via a third variable. The third variable is called the parameter. More precisely, a relation expressed between two variables $x$ and $y$ in the form $x=f(t), y=g(t)$ is said to be in parametric form with $t$ as a parameter.

In order to find the derivative of a function in such a form, we have, by the chain rule,

$ \frac{d y}{d t}=\frac{d y}{d x} \cdot \frac{d x}{d t} $

or

$ \frac{d y}{d x}=\frac{\frac{d y}{d t}}{\frac{d x}{d t}}(\text{ whenever } \frac{d x}{d t} \neq 0) $

Thus

$ \frac{d y}{d x}=\frac{g^{\prime}(t)}{f^{\prime}(t)} \quad(\text{as } \frac{d y}{d t}=g^{\prime}(t) \text{ and } \frac{d x}{d t}=f^{\prime}(t)) \quad[\text{provided } f^{\prime}(t) \neq 0] $
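
For example (a sympy sketch of the rule, with the circle chosen only for illustration), if $x=a\cos t$ and $y=a\sin t$, then $\frac{d y}{d x}=\frac{d y / d t}{d x / d t}=-\cot t$ wherever $\frac{d x}{d t}=-a\sin t \neq 0$.

```python
# Parametric differentiation for x = a*cos(t), y = a*sin(t) using sympy:
# dy/dx = (dy/dt)/(dx/dt), valid wherever dx/dt = -a*sin(t) is nonzero.
import sympy as sp

t, a = sp.symbols('t a', positive=True)
x = a * sp.cos(t)
y = a * sp.sin(t)

dydx = sp.diff(y, t) / sp.diff(x, t)
print(sp.simplify(dydx))   # equivalent to -cot(t)
```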

5.7 Second Order Derivative

Let $y=f(x)$. Then

$ \frac{d y}{d x}=f^{\prime}(x) \qquad \ldots (1) $

If $f^{\prime}(x)$ is differentiable, we may differentiate (1) again w.r.t. $x$. Then, the left hand side becomes $\frac{d}{d x}(\frac{d y}{d x})$ which is called the second order derivative of $y$ w.r.t. $x$ and is denoted by $\frac{d^{2} y}{d x^{2}}$. The second order derivative of $f(x)$ is denoted by $f^{\prime \prime}(x)$. It is also denoted by $D^{2} y$ or $y^{\prime \prime}$ or $y_2$ if $y=f(x)$. We remark that higher order derivatives may be defined similarly.
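
A small sympy sketch illustrates the notation: for the arbitrarily chosen function $y=x^{3}+\sin x$, differentiating once gives $\frac{d y}{d x}$ and differentiating again gives $\frac{d^{2} y}{d x^{2}}$.

```python
# First and second order derivatives of y = x**3 + sin(x) with sympy.
import sympy as sp

x = sp.symbols('x')
y = x**3 + sp.sin(x)

first  = sp.diff(y, x)      # dy/dx    = 3*x**2 + cos(x)
second = sp.diff(y, x, 2)   # d2y/dx2  = 6*x - sin(x)
print(first)
print(second)
```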

5.8 Mean Value Theorem

In this section, we will state two fundamental results in Calculus without proof. We shall also learn the geometric interpretation of these theorems.

Theorem 6 (Rolle’s Theorem) Let $f:[a, b] \rightarrow \mathbf{R}$ be continuous on $[a, b]$ and differentiable on $(a, b)$, such that $f(a)=f(b)$, where $a$ and $b$ are some real numbers. Then there exists some $c$ in $(a, b)$ such that $f^{\prime}(c)=0$.

In Fig 5.12 and 5.13, graphs of a few typical differentiable functions satisfying the hypothesis of Rolle’s theorem are given.

Fig 5.12

Fig 5.13

Observe what happens to the slope of the tangent to the curve at various points between $a$ and $b$. In each of the graphs, the slope becomes zero at least at one point. That is precisely the claim of Rolle’s theorem, as the slope of the tangent at any point on the graph of $y=f(x)$ is nothing but the derivative of $f(x)$ at that point.

Theorem 7 (Mean Value Theorem) Let $f:[a, b] \rightarrow \mathbf{R}$ be a continuous function on $[a, b]$ and differentiable on $(a, b)$. Then there exists some $c$ in $(a, b)$ such that $$ f^{\prime}(c)=\frac{f(b)-f(a)}{b-a} $$

Observe that the Mean Value Theorem (MVT) is an extension of Rolle’s theorem. Let us now understand a geometric interpretation of the MVT. The graph of a function $y=f(x)$ is given in Fig 5.14. We have already interpreted $f^{\prime}(c)$ as the slope of the tangent to the curve $y=f(x)$ at $(c, f(c))$. From Fig 5.14 it is clear that $\frac{f(b)-f(a)}{b-a}$ is the slope of the secant drawn between $(a, f(a))$ and $(b, f(b))$. The MVT states that there is a point $c$ in $(a, b)$ such that the slope of the tangent at $(c, f(c))$ is the same as the slope of the secant between $(a, f(a))$ and $(b, f(b))$. In other words, there is a point $c$ in $(a, b)$ such that the tangent at $(c, f(c))$ is parallel to the secant between $(a, f(a))$ and $(b, f(b))$.

Fig 5.14
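
A concrete instance (our own choice of function and interval, worked numerically below) is $f(x)=x^{2}$ on $[1,3]$: the slope of the secant is $\frac{f(3)-f(1)}{3-1}=4$, and $f^{\prime}(c)=2c=4$ gives $c=2$, which indeed lies in $(1,3)$.

```python
# Mean Value Theorem illustration for f(x) = x**2 on [1, 3]: solve
# f'(c) = (f(b) - f(a))/(b - a) for c and check that c lies in (a, b).

def f(x):
    return x ** 2

def fprime(x):
    return 2 * x

a, b = 1.0, 3.0
secant_slope = (f(b) - f(a)) / (b - a)           # (9 - 1)/(3 - 1) = 4
c = secant_slope / 2                             # solve f'(c) = 2c = secant_slope
print(c, a < c < b, fprime(c) == secant_slope)   # 2.0 True True
```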

Summary

A real valued function is continuous at a point in its domain if the limit of the function at that point equals the value of the function at that point. A function is continuous if it is continuous on the whole of its domain.

  • Sum, difference, product and quotient of continuous functions are continuous, i.e., if $f$ and $g$ are continuous functions, then

$(f \pm g)(x)=f(x) \pm g(x)$ is continuous.

$(f . g)(x)=f(x) \cdot g(x)$ is continuous.

$(\frac{f}{g})(x)=\frac{f(x)}{g(x)}$ (wherever $g(x) \neq 0$) is continuous.

  • Every differentiable function is continuous, but the converse is not true.
  • The chain rule is a rule to differentiate composites of functions. If $f=v \circ u$, $t=u(x)$ and if both $\frac{d t}{d x}$ and $\frac{d v}{d t}$ exist, then

$ \frac{d f}{d x}=\frac{d v}{d t} \cdot \frac{d t}{d x} $

Following are some of the standard derivatives (in appropriate domains):

$ \begin{matrix} \frac{d}{d x}(\sin ^{-1} x)=\frac{1}{\sqrt{1-x^{2}}} & \frac{d}{d x}(\cos ^{-1} x)=-\frac{1}{\sqrt{1-x^{2}}} \\ \frac{d}{d x}(\tan ^{-1} x)=\frac{1}{1+x^{2}} & \frac{d}{d x}(\log x)=\frac{1}{x} \end{matrix} $

Logarithmic differentiation is a powerful technique to differentiate functions of the form $f(x)=[u(x)]^{v(x)}$. Here both $f(x)$ and $u(x)$ need to be positive for this technique to make sense.


