Big gap! You implicitly assume that both equations share two roots. This is the crux of the matter, and it requires rigorous proof (the accepted answer has the same gap). Here is one way to fill it.
Hint $ $ Call them $\,f(x)\,$ and $\,g(x).\,$ A common root of $\,f\,$ and $\,g\,$ is also a root of the polynomial $\,h\,$ obtained by eliminating their quadratic terms: $\,h = f-(a/2)\,g.\,$ Since the discriminant of $\,g\,$ is negative, both roots of $\,g\,$ are non-real, so in particular not rational. But $\,h\,$ has degree $\le 1\,$ and rational coefficients, so any root of $\,h\,$ is rational unless $\,h = 0.\,$ Hence $\,h = 0,\,$ i.e. $\,f = (a/2)\,g\,$ is a constant multiple of $\,g.\,$ The rest is straightforward.
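Explicitly, a sketch assuming (as the thread indicates) that $\,g = 2x^2+3x+4\,$ and $\,f = ax^2+bx+c\,$:
$$h \,=\, f - (a/2)\,g \,=\, \left(b - \tfrac{3a}{2}\right)x + (c - 2a),$$
so $\,h = 0\,$ forces $\,b = \tfrac{3a}{2}\,$ and $\,c = 2a;\,$ the least positive integers satisfying this are $\,(a,b,c) = (2,3,4),\,$ giving $\,a+b+c = 9.$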
Remark $\ $ More generally, any common root of $\,f,g\,$ is also a root of every linear combination $\, h_1 f + h_2 g.\,$ But, by Bézout, we know that $\,\gcd(f,g)\,$ has that form, hence $\,f(\alpha) = 0 = g(\alpha)\,\Rightarrow\, \gcd(f,g)(\alpha) = 0.$
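For a concrete instance: take $\,f = x^2-1 = (x-1)(x+1)\,$ and $\,g = x^2-3x+2 = (x-1)(x-2).\,$ Then
$$\gcd(f,g) \,=\, x-1 \,=\, \tfrac{1}{3}\,f - \tfrac{1}{3}\,g,$$
and the common root $\,x = 1\,$ is indeed a root of $\,\gcd(f,g).$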
More importantly, two quadratic equations $ax^2+bx+c=0$ and $a'x^2+b'x+c'=0$ have the same solutions if and only if $a'=ka$, $b'=kb$, and $c'=kc$ for some nonzero real number $k$. Here it is important that $a$, $b$, and $c$ are integers to conclude that the minimum of $a+b+c$ exists and is attained at $a=2$, $b=3$, and $c=4$ (if $a$, $b$, and $c$ were rational numbers, then the minimum would not exist).
– Taladris Jan 13 '14 at 12:52
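To see why the minimum fails over the rationals (assuming, as the question presumably requires, that $a,b,c > 0$): proportionality forces $\,(a,b,c) = (2k,3k,4k)\,$ for some $k>0$, so
$$a+b+c \,=\, 9k,$$
which over the positive rationals takes values arbitrarily close to $0$ without ever attaining a minimum, while over the positive integers the smallest choice is $k=1$, giving $2+3+4 = 9$.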