Exercise (1):
$\Leftarrow$: Assume $A$ were not injective, i.e. there were $v \neq v'$ with $Av = Av'$. Then $v = BAv = BAv' = v'$, which is a contradiction.
$\Rightarrow$: Define the linear map $B: Im(A) \to V$ by
$$B(w) \in A^{-1}(w)$$
for every $w \in Im(A)$. This is well-defined since $A$ is injective: the preimage $A^{-1}(w)$ contains exactly one element, so $B(w)$ is uniquely determined.
As the inverse of the linear bijection $A: V \to Im(A)$, $B$ is itself linear.
Extend $B$ linearly from the subspace $Im(A)$ to all of $W$. Then $BAv = B(Av) = v$ holds by definition.
Such a linear extension exists because every subspace of a vector space has a complement: there is a subspace $M$ such that $W = Im(A) \oplus M$.
Now define $B': W \to V$ by $B'(w) = B'(x + y) := B(x) + 0$, where $w = x + y$ is the unique decomposition with $x \in Im(A)$ and $y \in M$.
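For intuition, here is a minimal numerical sketch of this construction in finite dimensions (not part of the original argument): numpy, the particular matrix $A$, and the choice $M = Im(A)^\perp$ with $B = (A^T A)^{-1} A^T$ are assumptions made purely for illustration.

```python
import numpy as np

# Injective linear map A: R^2 -> R^3 (full column rank), chosen arbitrarily.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# B = (A^T A)^{-1} A^T inverts A on Im(A) and is zero on M = Im(A)^perp,
# matching the extension B'(x + y) = B(x) + 0.
B = np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(B @ A, np.eye(2)))   # True: B A = Id_V

# B kills vectors orthogonal to Im(A), e.g. the cross product of the columns.
m = np.cross(A[:, 0], A[:, 1])
print(np.allclose(B @ m, 0))           # True: B vanishes on M
```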
Alternative: Let $\{v_i \mid i \in I\}$ be a basis of $V$. Then $N := \{Av_i \mid i \in I\}$ is again a linearly independent set:
$$0 = \sum_k \alpha_k Av_k = A\Big(\sum_k \alpha_k v_k\Big) \;\Rightarrow\; \sum_k \alpha_k v_k = 0 \text{ (by injectivity)} \;\Rightarrow\; \alpha_k = 0 \text{ for all } k.$$
Extend $N$ to a basis $N \cup \{w_j \mid j \in J\}$ of $W$ and, following Ennar's suggestion, define $B: W \to V$ by its values on the basis elements: $B(Av_i) := v_i$ and $B(w_j) := 0$ (this choice is arbitrary).
Then, writing $v = \sum_k \alpha_k v_k$, we get $BAv = BA\big(\sum_k \alpha_k v_k\big) = \sum_k \alpha_k B(Av_k) = \sum_k \alpha_k v_k = v$ as required.
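The basis construction can also be checked numerically; the following sketch reuses the matrix $A$ from above, and the extension vector `w3` is an assumed, arbitrary choice outside $Im(A)$.

```python
import numpy as np

# Same injective A: R^2 -> R^3; its columns are A e_1, A e_2.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# Extend {A e_1, A e_2} to a basis of R^3 by an extra vector w3 outside Im(A).
w3 = np.array([[1.0], [0.0], [1.0]])
P = np.hstack([A, w3])                        # change-of-basis matrix
assert abs(np.linalg.det(P)) > 1e-12          # the extension is indeed a basis

# B sends A e_i -> e_i and w3 -> 0, expressed in coordinates via P^{-1}.
B = np.hstack([np.eye(2), np.zeros((2, 1))]) @ np.linalg.inv(P)

print(np.allclose(B @ A, np.eye(2)))          # True: B A = Id_V
print(np.allclose(B @ w3, 0))                 # True: B(w3) = 0
```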
Exercise (2):
$\Leftarrow$: $AC = Id_W$ directly implies that $A$ is surjective: every $w \in W$ satisfies $w = A(Cw)$, hence $w \in Im(A)$.
$\Rightarrow$: Again following Ennar's suggestion, let $\{w_i \mid i \in I\}$ be a basis of $W$. Since $A$ is surjective, every preimage $A^{-1}(w_i)$ is non-empty, so we can define the linear map $C: W \to V$ on the basis by choosing $$C(w_i) \in A^{-1}(w_i).$$
Then, writing $w = \sum_j \beta_j w_j$ and using $A(Cw_j) = w_j$, we get $ACw = AC\big(\sum_j \beta_j w_j\big) = \sum_j \beta_j A(Cw_j) = \sum_j \beta_j w_j = w.$
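The right-inverse construction can be sketched numerically as well; the surjective matrix $A$ below and the minimal-norm choice of preimages $C = A^T (A A^T)^{-1}$ are assumptions for illustration, since any choice of $C(w_i) \in A^{-1}(w_i)$ would do.

```python
import numpy as np

# Surjective linear map A: R^3 -> R^2 (full row rank).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# Columns of C are chosen preimages of the standard basis vectors of W;
# the minimal-norm choice A^T (A A^T)^{-1} is one convenient option.
C = A.T @ np.linalg.inv(A @ A.T)

print(np.allclose(A @ C, np.eye(2)))   # True: A C = Id_W
```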