Surely, $L_n$ cannot grow exponentially: it does not exceed $n$, so the first guess is simply a mistake. The second guess is correct, though: the order of $L_n$ is exactly $n$. However, $L_n/n$ converges not to a constant but to a nondegenerate distribution. A quick way to see this is to note that $L_n = n$ with probability $(1-\gamma/n)^n \to e^{-\gamma}$. But it is possible to identify the limit distribution explicitly.
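This is easy to check numerically; here is a minimal Monte Carlo sketch (the parameter choices $\gamma = 1$, $n = 200$ are mine, purely for illustration), estimating how often all $n$ flips come up heads:

```python
import math
import random

random.seed(0)

gamma_, n, trials = 1.0, 200, 5000
p_tail = gamma_ / n  # each flip is a tail with probability gamma/n

# Count how often the first n flips contain no tail at all, i.e. L_n = n.
all_heads = sum(
    all(random.random() >= p_tail for _ in range(n)) for _ in range(trials)
)

freq = all_heads / trials
# (1 - gamma/n)^n is already close to e^{-gamma} at n = 200.
print(freq, math.exp(-gamma_))
```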
Let $T^n_k$ be the moment when the $k$th tail appears, and let $X_n = \#\{k\le n: T_k^n\le n\}$ be the number of tails in the first $n$ flips. Then $X_n$ has the binomial $(n,\gamma/n)$ distribution, and it is well known that $X_n$ converges in distribution, as $n\to\infty$, to a random variable $X$ with the Poisson distribution with parameter $\gamma$.
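The binomial-to-Poisson convergence can also be seen numerically; a sketch (the choices $\gamma = 2$ and $n = 100\,000$ are illustrative):

```python
import math

gamma_, n = 2.0, 100_000
p = gamma_ / n

def binom_pmf(k: int) -> float:
    # P(X_n = k) for X_n ~ Binomial(n, gamma/n)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k: int) -> float:
    # P(X = k) for X ~ Poisson(gamma)
    return math.exp(-gamma_) * gamma_ ** k / math.factorial(k)

# The pointwise gap is O(gamma^2 / n), already tiny at this n.
max_gap = max(abs(binom_pmf(k) - poisson_pmf(k)) for k in range(25))
print(max_gap)
```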
Further, it is easy to see that, given $X_n = k$, the set $\{T^n_1,T^n_2,\dots,T^n_k\}$ is uniformly distributed among the $k$-subsets of $\{1,\dots,n\}$. Therefore, as $n\to\infty$, the distribution of $(T^n_1/n,T^n_2/n,\dots,T^n_k/n)$ given $X_n=k$ converges to the uniform distribution on $$\mathbb{S}_k = \{(x_1,\dots,x_k): 0\le x_1\le \dots\le x_k\le 1\}.$$ The lengths of the head runs in the first $n$ flips are just $T^n_i-T^n_{i-1}-1$, $1\le i\le X_n$ (with the convention $T^n_0 = 0$), together with the final run $n-T^n_{X_n}$. Therefore, given $X_n=k$, these lengths, divided by $n$, converge in distribution to $(S_1,\dots,S_{k+1}) = (\tau_1, \tau_2 - \tau_1,\tau_3-\tau_2,\dots,\tau_k - \tau_{k-1},1-\tau_k)$, where the vector $(\tau_1,\dots,\tau_k)$ is uniformly distributed over $\mathbb S_k$.
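To make the correspondence between tail positions and run lengths concrete, here is a small helper (my own illustration, not part of the argument):

```python
def head_runs(n: int, tails: list[int]) -> list[int]:
    """Head-run lengths in n flips, given sorted 1-based tail positions."""
    boundaries = [0] + sorted(tails)  # T_0 = 0 convention
    runs = [boundaries[i] - boundaries[i - 1] - 1 for i in range(1, len(boundaries))]
    runs.append(n - boundaries[-1])   # the run after the last tail
    return runs

# n = 10 with tails at positions 3 and 7 gives H H T H H H T H H H,
# i.e. head runs of lengths 2, 3 and 3.
runs = head_runs(10, [3, 7])
print(runs)  # [2, 3, 3]
```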
As a result, $L_n/n$ given $X_n=k$ converges in distribution, as $n\to\infty$, to
$$
M_k = \max\{S_1,\dots,S_{k+1}\},
$$
where the vector $(S_1,\dots,S_{k+1})$ is uniformly distributed over the simplex $\mathbb T_{k+1} = \{(x_1,\dots,x_{k+1}) : x_i\ge 0, x_1+\dots + x_{k+1} = 1\}$.
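These two descriptions of the limit (gaps between ordered uniforms vs. spacings on the simplex) can be cross-checked: the cdf of the maximal spacing has the classical inclusion-exclusion form $\mathsf P(M_k\le x) = \sum_i (-1)^i\binom{k+1}{i}(1-ix)^k$, and a quick simulation agrees with it. A sketch (parameters mine):

```python
import math
import random

random.seed(1)

def max_spacing_cdf(k: int, x: float) -> float:
    # P(M_k <= x) by inclusion-exclusion over which spacings exceed x
    return sum(
        (-1) ** i * math.comb(k + 1, i) * (1 - i * x) ** k
        for i in range(min(k + 1, math.floor(1 / x)) + 1)
    )

def sampled_max_spacing(k: int) -> float:
    # drop k uniform points in (0, 1); M_k is the largest gap they leave
    cuts = sorted(random.random() for _ in range(k))
    points = [0.0] + cuts + [1.0]
    return max(b - a for a, b in zip(points, points[1:]))

k, x, trials = 3, 0.5, 20_000
mc = sum(sampled_max_spacing(k) <= x for _ in range(trials)) / trials
print(max_spacing_cdf(k, x), mc)  # the exact value here is 0.5
```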
Hence we can find the limiting distribution of $L_n/n$; let $L$ denote a random variable with this distribution. It has an atom at $1$ (corresponding to the case $X = 0$) of size $e^{-\gamma}$ and a density on $(0,1)$ of the form
$$
f(x) = \sum_{k=1}^\infty \mathsf{P}(X = k) f_k (x) = \sum_{k=1}^\infty\frac{\gamma^k e^{-\gamma}}{k!} f_k(x),
$$
where $f_k(x)$ is the density of $M_k$. A standard inclusion-exclusion argument shows that
$$
f_k(x) = (k+1)k\sum_{i=0}^{\lfloor 1/x\rfloor-1} (-1)^{i}{k\choose i}\big(1-(i+1)x\big)^{k-1}.
$$
(Note that $f_k(x) = 0$ for $x<1/(k+1)$: the largest of $k+1$ spacings summing to $1$ is at least $1/(k+1)$.)
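As a sanity check on $f_k$, one can verify numerically that it vanishes below $1/(k+1)$ and integrates to $1$; a sketch (the choice $k = 4$ is mine):

```python
import math

def f_k(k: int, x: float) -> float:
    """Density of M_k, the maximum of k+1 uniform spacings, obtained by
    differentiating P(M_k <= x) = sum_i (-1)^i C(k+1, i) (1 - i x)^k."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    upper = min(k, math.floor(1 / x) - 1)
    return (k + 1) * k * sum(
        (-1) ** i * math.comb(k, i) * (1 - (i + 1) * x) ** (k - 1)
        for i in range(upper + 1)
    )

k = 4
# below 1/(k+1) the alternating sum is a k-th difference of a
# degree-(k-1) polynomial, hence zero ...
below = f_k(k, 1 / (k + 1) - 0.01)
# ... and the total mass is 1 (midpoint rule on (0, 1))
m = 100_000
integral = sum(f_k(k, (j + 0.5) / m) for j in range(m)) / m
print(below, integral)
```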
Finally we get
$$
f(x) = \sum_{k=1}^\infty\frac{\gamma^k e^{-\gamma}(k+1)}{(k-1)!}\sum_{i=0}^{\lfloor 1/x\rfloor-1} (-1)^{i}{k\choose i}\big(1-(i+1)x\big)^{k-1}.
$$
It is possible to rearrange the sum and write it somewhat more compactly. I'll give the final expression for the cdf $F(x) = \mathsf{P}(L<x)$ rather than for the density, as it looks nicer and captures both the continuous part and the jump at $1$:
$$
F(x) = e^{-\gamma}\left(\sum_{i=0}^{\lfloor 1/x\rfloor} \frac{\big(\gamma(ix-1)\big)^i}{i!}e^{\gamma(1-ix)} - \sum_{i=0}^{\lfloor 1/x\rfloor - 1} \frac{\big(\gamma((i+1)x-1)\big)^i}{i!}e^{\gamma(1-(i+1)x)}\right),\quad x\ge 0.
$$
Here we use the conventions $\sum_{i=0}^{-1} = 0$ and $0^0 = 1$.
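A direct numerical check of this cdf (my own sketch): it should equal $1$ for $x > 1$, have a jump of size $e^{-\gamma}$ at $x = 1$, and on $(1/2,1)$, where at most one spacing can exceed $x$, it reduces to $1 - e^{-\gamma x}(1+\gamma(1-x))$.

```python
import math

def F(gamma_: float, x: float) -> float:
    """Limiting cdf P(L < x) of L_n / n; Python's (-a) ** 0 == 1.0 and
    empty sums handle the 0^0 = 1 and sum_{i=0}^{-1} = 0 conventions."""
    if x <= 0:
        return 0.0
    m = math.floor(1 / x)
    s1 = sum(
        (gamma_ * (i * x - 1)) ** i / math.factorial(i)
        * math.exp(gamma_ * (1 - i * x))
        for i in range(m + 1)
    )
    s2 = sum(
        (gamma_ * ((i + 1) * x - 1)) ** i / math.factorial(i)
        * math.exp(gamma_ * (1 - (i + 1) * x))
        for i in range(m)
    )
    return math.exp(-gamma_) * (s1 - s2)

g = 1.5
print(F(g, 2.0))      # 1: the limit L never exceeds 1
print(1 - F(g, 1.0))  # e^{-gamma}: the atom at 1
x = 0.7               # on (1/2, 1), compare with the collapsed form
print(F(g, x), 1 - math.exp(-g * x) * (1 + g * (1 - x)))
```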