Answer to Question #188884 in Real Analysis for TUHIN SUBHRA DAS

Question #188884

Let f be a differentiable function on [a, b] and x belongs to [a, b]. Show that, if f'(x) = 0 and f''(x) > 0, then f must have a local maximum at x.


Expert's answer
2021-05-07T13:39:02-0400

To prove: if f'(x) = 0 and f''(x) > 0, then f has a local minimum at x. (With f''(x) > 0 the critical point is a minimum; the "local maximum" stated in the question corresponds to the analogous case f''(x) < 0.)

Proof:

First, a fact about continuous functions: if f is continuous on [a, b], x ∈ (a, b), and f(x) > k for some real constant k, then f(c) > k for all c sufficiently close to x.

If this were false, every neighbourhood of x would contain points c with f(c) ≤ k, which contradicts the continuity of f at x (see the ε-argument below).

The same reasoning handles the case f(x) < k: then f(c) < k for all c sufficiently close to x.
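
One way to make this step precise (a standard ε-argument, sketched here for completeness): put ε = f(x) − k > 0. Continuity of f at x gives a δ > 0 such that |f(c) − f(x)| < ε whenever |c − x| < δ. For every such c,

f(c) > f(x) − ε = f(x) − (f(x) − k) = k,

so f(c) > k on the whole interval (x − δ, x + δ). Replacing f by −f and k by −k gives the case f(x) < k.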


Now apply this to the function in the question: let x ∈ (a, b) with f''(x) > 0. By the statement above applied to f'' (assuming f'' is continuous at x), there is an open interval around x on which f'' > 0.


Since f'' > 0 on this open interval and f'(x) = 0, for every c ≠ x in the interval,

c < x ⇒ f'(c) < 0 and c > x ⇒ f'(c) > 0,

because f' is strictly increasing on the interval (a justification via the Mean Value Theorem is sketched below).
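
A worked justification of this sign pattern (via the Mean Value Theorem applied to f'): let I be the open interval on which f'' > 0 and take c ∈ I with c < x. The Mean Value Theorem on [c, x] gives some ξ ∈ (c, x) with

f'(x) − f'(c) = f''(ξ)(x − c) > 0,

so f'(c) < f'(x) = 0. For c > x, the same argument on [x, c] gives f'(c) − f'(x) = f''(ξ)(c − x) > 0, hence f'(c) > 0.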

Thus f' is negative immediately to the left of x and positive immediately to the right, so f is decreasing to the left of x and increasing to the right; hence f(c) ≥ f(x) for every c in the interval, and x is a local minimum of f by definition. The analogous argument with f''(x) < 0 gives the local-maximum variant.
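
For a concrete illustration, take f(t) = t² on [−1, 1] and x = 0. Then f'(t) = 2t, so f'(0) = 0, and f''(t) = 2 > 0. Indeed f'(t) < 0 for t < 0 and f'(t) > 0 for t > 0, and

f(t) = t² ≥ 0 = f(0) for all t,

so 0 is a local (in fact global) minimum. The mirror example f(t) = −t², with f''(0) = −2 < 0, has a local maximum at 0, matching the variant mentioned above.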

