
Suppose $f: \mathbb{R} \to \mathbb{R}$ is continuous at $x = 1$ and $g: \mathbb{R} \to \mathbb{R}$ is continuous at $y = f(1)$. Then $g \circ f$ is continuous at $x = 1$.

Attempt:

Let $\epsilon > 0$ be given. We can find $\delta', \delta''$ such that if $|x - 1| < \delta'$, then $|f(x) - f(1)| < \epsilon$, and we also have that $|f(z) - f(1)| < \delta''$ implies $|g(f(z)) - g(f(1))| < \epsilon$.

Isn't this just the proof? It seems obvious, but I feel confused trying to write it up nicely. Is this a correct proof?

  • You need to be a bit careful in choosing your $\delta'$. – Mar 27 '15 at 15:52
  • The steps are reversed. First pick $\delta''$ so that $|g(f(z))-g(f(1))| \lt \epsilon$, then choose $\delta'$ so that $|f(z)-f(1)|\lt \delta''$. – hardmath Dec 25 '17 at 21:28

1 Answer


You are on the right track!

Let $\epsilon_1,\epsilon_2>0$ be arbitrary. By the definition of continuity we have:

  • $f$ continuous at $1$ implies that there exists $\delta_1 >0$ such that $$|x-1|<\delta_1 \implies |f(x)-f(1)|<\epsilon_1$$

  • $g$ continuous at $u := f(1)$ implies that there exists $\delta_2>0$ such that $$|z-u|<\delta_2 \implies |g(z)-g(u)|<\epsilon_2.$$

Now, choose $\epsilon_1=\delta_2$ and merge the definitions.
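For concreteness, here is a sketch of how the merged argument can be written out (with $u = f(1)$ and taking $\epsilon_2 = \epsilon$, the target tolerance): given $\epsilon > 0$, first use the continuity of $g$ at $u$ to get $\delta_2$, then feed $\epsilon_1 = \delta_2$ into the continuity of $f$ at $1$ to get $\delta_1$. Then
$$|x-1|<\delta_1 \implies |f(x)-u|<\delta_2 \implies |g(f(x))-g(u)|<\epsilon,$$
which is precisely the statement that $g \circ f$ is continuous at $1$. Note the order of the choices: $\delta_2$ must be fixed before $\delta_1$, since $\delta_1$ depends on it.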

Surb