It is $O(\log n)$. Simply look at how the number of iterations your loop performs grows as $n$ increases. I know this is not a proof, by the way. You can observe that each time $n$ reaches a power of two, i.e. $n = 2^p$ for some $p \geqslant 0$, the number of iterations increases by one. Clearly, this happens only $O(\log n)$ times, and therefore your loop performs $O(\log n)$ iterations.
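To see this empirically, here is a minimal counting sketch. It assumes your loop starts with `k = 0` and updates it via `k = 2 * k + 1` until `k >= n`, which is what the closed form further down suggests; swap in your actual update if it differs.

    #include <stdio.h>

    /* Iterations of the (assumed) loop: k starts at 0 and becomes 2*k + 1
       each round until k >= n. */
    static int iterations(int n) {
        int count = 0;
        for (int k = 0; k < n; k = 2 * k + 1) {
            count++;
        }
        return count;
    }

    int main(void) {
        /* The count goes up by one exactly when n reaches the next power of two. */
        for (int n = 1; n <= 1024; n *= 2) {
            printf("n = %4d -> %d iterations\n", n, iterations(n));
        }
        return 0;
    }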
Now a slightly more formal attempt, but still with a few holes here and there.
We observe that at the end of the loop
$$k = 2^{\lfloor \log n \rfloor + 1} - 1.$$
Technically, we should prove this with a loop invariant or induction, but I'm a little too lazy for that now. If we let $n = 2^p$ for some integer $p \geqslant 1$, then $\lfloor \log n \rfloor = \lfloor \log(n - 1) \rfloor + 1$. Let $k_n$ denote the final value of $k$ when the loop is run with upper bound $n$. Then
$$k_n = 2^{p + 1} - 1 = 2^{\lfloor \log(n - 1) \rfloor + 2} - 1 > 2^{\lfloor \log(n - 1) \rfloor + 1} - 1 = k_{n - 1}.$$
So $k_n > k_{n - 1}$ exactly when $n$ is a power of two; for all other $n$ we have $\lfloor \log n \rfloor = \lfloor \log(n - 1) \rfloor$ and hence $k_n = k_{n - 1}$. Since the final value of $k$, and with it the number of iterations, only increases when $n$ passes a power of two, and this happens $O(\log n)$ times, the number of iterations is $O(\log n)$.
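For completeness, the induction I was too lazy to write out is short. Assuming (as above) that the loop starts with $k = 0$ and performs $k \leftarrow 2k + 1$ in every iteration, let $v_i$ be the value of $k$ after $i$ iterations. Then
$$v_0 = 0 = 2^0 - 1 \quad\text{and}\quad v_{i + 1} = 2 v_i + 1 = 2(2^i - 1) + 1 = 2^{i + 1} - 1,$$
so $v_i = 2^i - 1$ by induction. The loop exits at the smallest $i$ with $2^i - 1 \geqslant n$, which for $n \geqslant 1$ is $i = \lfloor \log n \rfloor + 1$. This matches the final value of $k$ claimed above and directly gives $O(\log n)$ iterations.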
Edit
A far more intuitive approach is to consider the following code:

    int k = 0;
    for (int i = 0; k < n; i++) {
        if (i == 1) {
            // k would stay 0 forever under pure doubling,
            // so seed it with 1 once (in the second iteration)
            k = 1;
        }
        k *= 2;
    }
Your code makes $k$ converge to $n$ more quickly than this example does, and this loop finishes after $O(\log n)$ iterations, so your code also uses at most $O(\log n)$ iterations.
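If you want to convince yourself numerically, here is a small comparison sketch. It again assumes your loop is the $k \leftarrow 2k + 1$ variant from above; replace that inner update with whatever your code actually does.

    #include <stdio.h>

    int main(void) {
        for (int n = 1; n <= 1 << 20; n *= 2) {
            /* iterations of the (assumed) original loop: k = 2*k + 1 */
            int a = 0;
            for (int k = 0; k < n; k = 2 * k + 1) {
                a++;
            }

            /* iterations of the doubling loop above */
            int b = 0;
            int k = 0;
            for (int i = 0; k < n; i++) {
                if (i == 1) {
                    k = 1;
                }
                k *= 2;
                b++;
            }

            printf("n = %7d: original %2d iterations, doubling %2d iterations\n", n, a, b);
        }
        return 0;
    }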