Bear in mind that multiplication in $G$ is the same as multiplication in $\mathbb R$, so whatever is true about multiplication in $\mathbb R$ will still be true of multiplication in $G$.
Multiplication is associative. We don't have to prove that.
We do have to prove that multiplication is closed on $G$. That is: if $w \in G$ and $u \in G$, then $wu \in G$ (which you did).
The identity in $\mathbb R$ is $1$, so the identity for $G$ will be $1$. We don't have to prove that $1*w = w$ for all $w\in G$, and we don't have to show that if $e*w = w$ then $e=1$. We know that must be true. The only thing we have to prove is that $1$ is actually a member of $G$. (Which, as $1 = 1 + 0\sqrt 2$, it obviously is.)
And as the inverse of $x$ in $\mathbb R$ is $\frac 1x$, we do know that the inverse of $w \in G$ will HAVE to be $\frac 1w$. The issue, though, is we have to prove that if $w \in G$ then $\frac 1w \in G$ also.
So let's prove that:
If $w = a+ b\sqrt 2\ne 0$ then $\frac 1w = \frac 1{a+b\sqrt 2}$. But is that in $G$? Can we write $\frac 1{a+b\sqrt 2}$ as $c +d \sqrt 2$ where $c$ and $d$ are rational? Is that actually true? How would we show that? How would we figure out what $c,d$ are?
Well, the hint "rationalize the denominator" is very apt.
$\frac 1{a+b\sqrt 2} = \frac 1{a+b\sqrt 2}\cdot \frac {a-b\sqrt 2}{a-b\sqrt 2}=$
$\frac {a-b\sqrt 2}{a^2 -2b^2 } =\frac a{a^2 -2 b^2}- \frac b{a^2 -2b^2}\sqrt 2$
And as $a,b \in \mathbb Q$ we have $\frac a{a^2 -2b^2},-\frac b{a^2 -2b^2}\in \mathbb Q$. So $\frac 1{a+b\sqrt 2} = \frac a{a^2 -2 b^2}- \frac b{a^2 -2b^2}\sqrt 2\in G$.
And that's that.
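As a sanity check (just an aside, not part of the proof), we can mirror the computation with Python's exact `fractions.Fraction` arithmetic, representing $a + b\sqrt 2$ as the coefficient pair $(a, b)$; the names `mul` and `inv` are my own:

```python
from fractions import Fraction

def mul(w, u):
    """Multiply w = (a, b) and u = (c, d), each standing for a + b*sqrt(2):
    (a + b sqrt 2)(c + d sqrt 2) = (ac + 2bd) + (ad + bc) sqrt 2."""
    a, b = w
    c, d = u
    return (a * c + 2 * b * d, a * d + b * c)

def inv(w):
    """Inverse via rationalizing the denominator:
    1/(a + b sqrt 2) = a/(a^2 - 2b^2) - (b/(a^2 - 2b^2)) sqrt 2."""
    a, b = w
    n = a * a - 2 * b * b   # nonzero for rational a, b not both 0 (sqrt 2 is irrational)
    return (a / n, -b / n)

w = (Fraction(3), Fraction(-5))   # w = 3 - 5*sqrt(2)
print(mul(w, inv(w)))             # (Fraction(1, 1), Fraction(0, 1)), i.e. w * (1/w) = 1
```

The coefficients stay in $\mathbb Q$ throughout, which is exactly the closure claim.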
=====
Actually, setting $(a+b\sqrt 2)(c + d\sqrt 2) = 1$ to get $ac + 2bd = 1$ and $ad + bc = 0$ is a perfectly fine setup.
$d = -\frac {bc}a$ (assuming $a \ne 0$)
$ac + 2b(-\frac {bc}a) = ac- \frac {2b^2c}a =1$
$c (a-\frac {2b^2}a) = 1$
$c = \frac 1{a-\frac{2b^2}a}=\frac {a}{a^2 -2b^2}$ (assuming $a-\frac{2b^2}a\ne 0$.... which it can't be, because $(a-\frac{2b^2}a)c = 1 \ne 0$.)
So $d = -\frac {bc}a = -\frac {b\cdot \frac {a}{a^2 -2b^2}}a= -\frac {b}{a^2 - 2b^2}$
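Just to double-check the algebra, here is the same substitution carried out in exact `Fraction` arithmetic (the helper name `solve_cd` is mine):

```python
from fractions import Fraction

def solve_cd(a, b):
    """Solve ac + 2bd = 1, ad + bc = 0 for c, d (case a != 0),
    using the substitution d = -bc/a from above."""
    c = a / (a * a - 2 * b * b)   # c = a/(a^2 - 2b^2)
    d = -b * c / a                # d = -b/(a^2 - 2b^2)
    return (c, d)

a, b = Fraction(3), Fraction(1)   # say w = 3 + sqrt(2)
c, d = solve_cd(a, b)
# plug back into the original system:
assert a * c + 2 * b * d == 1
assert a * d + b * c == 0
print(c, d)   # 3/7 -1/7
```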
....
And if $a$ does equal $0$ we have
$ac + 2bd = 2bd =1$ and so $b\ne 0$ and $ad +bc = bc = 0$.
Now $b \ne 0$ for two reasons: 1) $2bd =1 \ne 0$, and also because we are told $a + b\sqrt 2 \ne 0$, and if $a =0$ then $b\sqrt 2 \ne 0$, so $b \ne 0$.
So $b\ne 0$ and $bc = 0$, so $c = 0$.
And $2bd= 1$, so $d = \frac 1{2b}$.
... in other words $b\sqrt 2\cdot x = 1 \implies x = \frac 1{2b}\sqrt 2$
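And the $a = 0$ case checks out the same way (again just a `Fraction`-arithmetic aside, with an arbitrary sample value of $b$):

```python
from fractions import Fraction

# w = b*sqrt(2) with b != 0; the claimed inverse is (1/(2b))*sqrt(2),
# i.e. c = 0 and d = 1/(2b) in the c + d*sqrt(2) form.
b = Fraction(5, 3)
c, d = Fraction(0), 1 / (2 * b)
# the system ac + 2bd = 1, ad + bc = 0 with a = 0 reduces to:
assert 2 * b * d == 1
assert b * c == 0
print(d)   # 3/10
```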