Consider the outcomes $T,HT,HHT,HHH$. All but the last put you back at the start. So if $x$ is the expected number of tosses to get $HHH$, we have $$x=\frac{1}{2}(x+1)+\frac{1}{4}(x+2)+\frac{1}{8}(x+3)+\frac{1}{8}\cdot 3\ \ (*)$$ That is easy to solve for $x$, giving $$x=14$$
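As a quick sanity check, $(*)$ can be collected into the form $x = ax + b$ and solved with exact rational arithmetic (a Python sketch; the names `a` and `b` are my own):

```python
from fractions import Fraction as F

# Collect (*) into the form x = a*x + b.
a = F(1, 2) + F(1, 4) + F(1, 8)                             # coefficient of x
b = F(1, 2) * 1 + F(1, 4) * 2 + F(1, 8) * 3 + F(1, 8) * 3   # constant terms
x = b / (1 - a)
print(x)  # 14
```

Here $a=\frac{7}{8}$ and $b=\frac{7}{4}$, so $x=\frac{7/4}{1/8}=14$.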
---------- Added 26 June 2016 ----------
Now let us consider this solution more carefully. Note first that the events $T,HT,HHT,HHH$ are disjoint and exhaustive. They have probabilities $\frac{1}{2},\frac{1}{4},\frac{1}{8},\frac{1}{8}$ respectively.
Let $3H$ be the random variable "the number of tosses until the first run of three $H$ is achieved". Now $(*)$ states: $$E(3H)=E(3H|T)p(T)+E(3H|HT)p(HT)+E(3H|HHT)p(HHT)+E(3H|HHH)p(HHH)$$ This is sometimes known as "computing expectations by conditioning" (see, for example, Sheldon Ross, *Introduction to Probability Models*, §3.4, p. 100). It is often written more concisely as $$E(E(X|Y))=E(X)$$ where the outer expectation on the LHS is $E_Y(\cdot)$.
---------- Added 5-2-2022 ----------
The expected number of tosses to obtain three consecutive heads given that the first toss is a tail equals one plus the expected number of tosses to obtain three consecutive heads (starting from that point). $$\mathsf E(3H\mid T) = 1+\mathsf E(3H)$$
This is the $(x+1)$ factor in the first term of $(*)$; the other terms are evaluated in the same way.
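The relation $E(3H\mid T)=1+E(3H)$ can also be checked by simulation: condition on the first toss being a tail, and the conditional mean should come out near $1+14=15$ (a Monte Carlo sketch; the function name and seed are my choices):

```python
import random

def tosses_until_hhh(rng):
    """Return (first toss was heads?, number of tosses until HHH)."""
    run = tosses = 0
    first = None
    while run < 3:
        tosses += 1
        heads = rng.random() < 0.5
        if first is None:
            first = heads
        run = run + 1 if heads else 0
    return first, tosses

rng = random.Random(1)
samples = [tosses_until_hhh(rng) for _ in range(200_000)]
tails_first = [t for first, t in samples if not first]
print(round(sum(t for _, t in samples) / len(samples), 1))  # near 14
print(round(sum(tails_first) / len(tails_first), 1))        # near 15
```

The unconditional mean estimates $E(3H)=14$, and restricting to runs that open with a tail estimates $E(3H\mid T)=15$.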
It follows directly from the definitions of conditional probability and expectation. The discrete case is particularly straightforward.
It just comes down to what is sometimes called the "partition theorem": if $\{B_n\}$ is a partition of the sample space, then $E(X)=\sum_nE(X|B_n)p(B_n)$. Note that we have $$E(X|Y=y)=\sum_xx\,p(X=x|Y=y)=\sum_xx\frac{p(X=x\cap Y=y)}{p(Y=y)}$$ where the last equality is just the definition $$p(A|B)=\frac{p(A\cap B)}{p(B)}$$
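For this problem the partition theorem can be verified with exact arithmetic: the restart argument gives $E(3H\mid T)=15$, $E(3H\mid HT)=16$, $E(3H\mid HHT)=17$, and $E(3H\mid HHH)=3$ (a sketch; the list name is my own):

```python
from fractions import Fraction as F

# (probability, conditional expectation) for the four partition events
events = [(F(1, 2), 15),   # T:   restart after 1 toss
          (F(1, 4), 16),   # HT:  restart after 2 tosses
          (F(1, 8), 17),   # HHT: restart after 3 tosses
          (F(1, 8), 3)]    # HHH: done in 3 tosses
total = sum(p * e for p, e in events)
print(total)  # 14
```

The weighted sum $\frac{15}{2}+4+\frac{17}{8}+\frac{3}{8}=14$ recovers $E(3H)$ exactly.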
Having written all that, I see that Wikipedia calls it the "Law of total expectation" and has an excellent article on it.