(Formerly a partial answer; now a full answer, I hope.)
To prove 1.1:
There's another inequality to use: We have for any $b \in \mathbb R$ that $$1+b \le e^b \tag{A}$$
By $(0)$, $M_X(a) \le 1+(e^a-1)E[X]$. By $(A)$ with $b=(e^a-1)E[X]$, we get $1+(e^a-1)E[X] \le e^{E[X](e^a-1)}$.
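As a quick numeric sanity check (not part of the proof), assuming for illustration that $X \sim \mathrm{Bernoulli}(p)$ — in which case $(0)$ holds with equality, $M_X(a) = 1 + (e^a-1)E[X]$ — the chain $M_X(a) \le 1+(e^a-1)E[X] \le e^{E[X](e^a-1)}$ can be checked on a grid of $p$ and $a$:

```python
import math

# Sanity check (not a proof): for X ~ Bernoulli(p), where (0) holds with
# equality, verify M_X(a) = 1 + (e^a - 1) p <= exp(p (e^a - 1)), i.e. bound 1.1.
def mgf_and_bound(p, a):
    mgf = 1.0 - p + p * math.exp(a)            # M_X(a) for Bernoulli(p)
    bound = math.exp(p * (math.exp(a) - 1.0))  # right-hand side of 1.1
    return mgf, bound

for p in [0.0, 0.1, 0.5, 0.9, 1.0]:
    for a in [-2.0, -0.5, 0.5, 2.0]:
        mgf, bound = mgf_and_bound(p, a)
        assert mgf <= bound + 1e-12, (p, a)
```

Note that the check also passes for negative $a$, consistent with $(A)$ holding for all real $b$.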
Note: I got inequality $(A)$ from https://en.wikipedia.org/wiki/Moment-generating_function#Other_properties, in the part that says 'This follows from the simple inequality'.
To prove 1.2 from 1.1:
By independence (independence of the $X_i$ implies independence of the $e^{aX_i}$, since $f(x)=e^{ax}$, $f: \mathbb R \to \mathbb R$, is continuous and hence measurable, and measurable functions of independent random variables are independent; see here, or use the definition of independence in terms of the joint density factoring),
$$\mathbb{E}\Big[e^{a\sum_i X_i}\Big] = \mathbb{E}\Big[\prod_i e^{aX_i}\Big] = \prod_i \mathbb{E}[e^{aX_i}]$$
Then 1.1 says $E[e^{aX_i}] \le e^{\mathbb{E}[X_i](e^a-1)}$. Hence, from non-negativity of each $E[e^{aX_i}]$,
$$\mathbb{E}[e^{a(\sum_i X_i)}] \le \prod_i e^{\mathbb{E}[X_i](e^a-1)} = e^{\sum_i \mathbb{E}[X_i](e^a-1)} = e^{E[S](e^a-1)}$$
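Again as a sanity check, assuming for illustration independent $X_i \sim \mathrm{Bernoulli}(p_i)$ (not stated in the problem), the exact $M_S(a) = \prod_i (1-p_i+p_i e^a)$ stays below the bound in 1.2:

```python
import math

# Sanity check of 1.2 (assuming independent X_i ~ Bernoulli(p_i)): the exact
# M_S(a) = prod_i (1 - p_i + p_i e^a) should be <= exp(E[S](e^a - 1)).
def mgf_S(ps, a):
    out = 1.0
    for p in ps:
        out *= 1.0 - p + p * math.exp(a)  # M_{X_i}(a) for Bernoulli(p_i)
    return out

ps = [0.1, 0.25, 0.5, 0.8]
ES = sum(ps)                              # E[S] = sum_i E[X_i]
for a in [-1.0, 0.5, 1.0, 2.0]:
    assert mgf_S(ps, a) <= math.exp(ES * (math.exp(a) - 1.0)) + 1e-12
```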
For 2 (attempt 1):
Not sure Markov's inequality is helpful, but...
Markov's inequality, assuming applicable, gives
$$\mathbb{P}(S\geq (1 \pm t)\mathbb{E}[S])\le \frac{E[S]}{(1 \pm t)\mathbb{E}[S]} = \frac{1}{1 \pm t}$$
Here, Markov's inequality is applicable because $S$ and $(1 \pm t)\mathbb{E}[S]$ are non-negative, provided $\mathbb{E}[S]$ is nonzero. If $\mathbb{E}[S] = 0$, then $S = 0$ almost surely (a non-negative random variable has zero mean if and only if it is almost surely zero), and the target bound holds trivially because its right-hand side is $(\cdot)^{\mathbb{E}[S]} = (\cdot)^0 = 1$.
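To see that attempt 1's bound at least holds numerically, here is a check assuming $S \sim \mathrm{Binomial}(n,p)$ (an illustrative assumption; the problem may be more general), with the tail probability computed from the exact pmf:

```python
import math

# Numeric check of attempt 1's Markov bound P(S >= (1+t) E[S]) <= 1/(1+t),
# assuming S ~ Binomial(n, p) (i.e. a sum of i.i.d. Bernoulli(p) variables).
def binom_tail(n, p, s):
    # P(S >= s), summing the exact Binomial(n, p) pmf
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(math.ceil(s), n + 1))

n, p = 50, 0.3
ES = n * p
for t in [0.1, 0.5, 0.9]:
    assert binom_tail(n, p, (1 + t) * ES) <= 1.0 / (1 + t)
```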
For 2 (attempt 2):
I think Markov's inequality is helpful, but applied not to $P(S \ge (1 \pm t)\mathbb{E}[S])$ directly; rather to $P(S \ge (1 \pm t)\mathbb{E}[S]) = P(e^{mS} \ge e^{m(1 \pm t)\mathbb{E}[S]})$, for any $m > 0$. We get
$$P(S \ge (1 \pm t)\mathbb{E}[S]) \le e^{-m((1 \pm t)\mathbb{E}[S])} M_S(m) \tag{B}$$
This is based on the fact (see here) that for any random variable $Z$ (not necessarily non-negative), any real $z$, and any $m > 0$,
$$P(Z \ge z) \le e^{-zm} M_Z(m), \quad \text{where } M_Z(m) := E[e^{mZ}]$$
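A numeric check of this exponential Markov bound, again assuming $Z \sim \mathrm{Binomial}(n,p)$ purely for illustration (its MGF is $(1-p+pe^m)^n$):

```python
import math

# Numeric check of the exponential Markov bound P(Z >= z) <= e^{-zm} M_Z(m),
# assuming Z ~ Binomial(n, p), whose MGF is M_Z(m) = (1 - p + p e^m)^n.
def binom_tail(n, p, z):
    # P(Z >= z), summing the exact Binomial(n, p) pmf
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(math.ceil(z), n + 1))

n, p, z = 40, 0.25, 15.0
for m in [0.1, 0.5, 1.0, 2.0]:
    mgf = (1 - p + p * math.exp(m)) ** n
    assert binom_tail(n, p, z) <= math.exp(-z * m) * mgf
```

The bound holds for every $m > 0$; the whole point of the Chernoff method is to then optimize over $m$, which is what the choices of $m$ below do.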
I think I got it for (2.1):
In $(B)$, choose $m=\ln(1+t)$, which is indeed positive for $t > 0$ ($t < 1$ is not needed here; it is used in (2.2)), and then use 1.2 on $E[e^{mS}]$ with $a=m$:
$$P(S \ge (1 + t)\mathbb{E}[S]) \le e^{(e^m-1-m-mt)E[S]}$$
and then $e^{(e^m-1-m-mt)E[S]} \le \left(\frac{e^t}{(1+t)^{1+t}}\right)^{\mathbb{E}[S]}$ holds if and only if $(e^m-1-m-mt)E[S] \le \mathbb{E}[S] \ln\left(\frac{e^t}{(1+t)^{1+t}}\right)$, which (for $\mathbb{E}[S] > 0$) holds if and only if $e^m-1-m-mt \le \ln\left(\frac{e^t}{(1+t)^{1+t}}\right) = t - (1+t)\ln(1+t)$. With $m = \ln(1+t)$ we have $e^m - 1 = t$ and $m + mt = (1+t)\ln(1+t)$, so the left-hand side equals $t - (1+t)\ln(1+t)$: the last inequality holds with equality.
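A quick numeric confirmation that with $m = \ln(1+t)$ the two sides of this last step agree exactly (as they should, since $e^m - 1 = t$):

```python
import math

# Check that with m = ln(1+t), the exponent e^m - 1 - m - m t collapses to
# t - (1+t) ln(1+t), i.e. the final step of 2.1 holds with equality.
for t in [0.01, 0.1, 0.5, 0.9]:
    m = math.log(1 + t)
    lhs = math.exp(m) - 1 - m - m * t
    rhs = t - (1 + t) * math.log(1 + t)
    assert abs(lhs - rhs) < 1e-12
```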
Similarly for 2.2: (maybe related: Borel-Cantelli-related exercise: Show that $\sum_{n=1}^{\infty} p_n < 1 \implies \prod_{n=1}^{\infty} (1-p_n) \geq 1- S$.)
Take $m=\ln(1-t)$, which is defined because $t < 1$ and negative because $0 < t$, so both assumptions on $t$ are used here. Since $m < 0$, the map $x \mapsto e^{mx}$ is decreasing, so $P(S \le (1-t)\mathbb{E}[S]) = P(e^{mS} \ge e^{m(1-t)\mathbb{E}[S]})$, and plain Markov's inequality applied to the non-negative variable $e^{mS}$ gives the analogue of $(B)$. Instead of having to prove '$e^m-1-m-mt \le t - (1+t)\ln(1+t)$', we now have to prove
$$e^m-1-m+mt \le \ln\left(\frac{e^{-t}}{(1-t)^{1-t}}\right) = -t - (1-t)\ln(1-t),$$
so the sign in the '$\pm$' is the minus. With $m = \ln(1-t)$ we have $e^m - 1 = -t$ and $m - mt = (1-t)\ln(1-t)$, so the left-hand side equals $-t - (1-t)\ln(1-t)$, and again the inequality holds with equality.
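And the analogous check for 2.2: with $m = \ln(1-t)$, the exponent $e^m-1-m+mt$ matches $-t-(1-t)\ln(1-t)$ exactly (i.e. the '$-$' version of the '$\pm$'):

```python
import math

# Check that with m = ln(1-t), the exponent e^m - 1 - m + m t equals
# -t - (1-t) ln(1-t): the '-' sign in the "±" is the right one, and
# the final step of 2.2 holds with equality.
for t in [0.01, 0.1, 0.5, 0.9]:
    m = math.log(1 - t)
    lhs = math.exp(m) - 1 - m + m * t
    rhs = -t - (1 - t) * math.log(1 - t)
    assert abs(lhs - rhs) < 1e-12
```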
Remark: remember to prove Markov's inequality, if you haven't done so in class!