As David K mentions, there are infinitely many rotations that map $\bf a$ to $\bf b$: given any such rotation $R$ and any rotation $S$ about $\bf b$, the composition $S \circ R$ also maps $\bf a$ to $\bf b$, so there is a circle's worth of such rotations.
If the pair $({\bf a}, {\bf b})$ is linearly independent, however, then we can pick out a preferred rotation, namely the one that fixes ${\bf a} \times {\bf b}$ (equivalently, the unique rotation that preserves the plane spanned by $\bf a$ and $\bf b$ as well as the orientation of that plane).
Here's one way to construct the matrix corresponding to this preferred rotation: First, notice that we may as well normalize $\bf a$ and $\bf b$, that is, replace them respectively by the unit vectors $\frac{\bf a}{|{\bf a}|}$ and $\frac{\bf b}{|{\bf b}|}$.
The vector
$${\bf n} := \frac{{\bf a} \times {\bf b}}{|{\bf a} \times {\bf b}|}$$ has unit length and is orthogonal to both $\bf a$ and $\bf b$, and hence ${\bf a}$ and ${\bf b}$ together determine an oriented, orthonormal basis of $\Bbb R^3$, namely,
$$({\bf a}, {\bf n} \times {\bf a}, {\bf n}).$$
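As a quick numerical sanity check (a sketch using NumPy; the particular vectors $\bf a$ and $\bf b$ here are arbitrary illustrative choices, not from the question), one can verify that these three vectors really do form an oriented, orthonormal basis:

```python
import numpy as np

# Illustrative linearly independent vectors, normalized to unit length.
a = np.array([1.0, 2.0, 2.0]); a /= np.linalg.norm(a)
b = np.array([0.0, 3.0, 4.0]); b /= np.linalg.norm(b)

# Unit normal n = (a x b) / |a x b|; defined since a, b are independent.
n = np.cross(a, b)
n /= np.linalg.norm(n)

# The claimed oriented, orthonormal basis (a, n x a, n), as matrix columns.
basis = np.column_stack([a, np.cross(n, a), n])

# Orthonormal columns: basis^T basis = I; orientation: det = +1.
print(np.allclose(basis.T @ basis, np.eye(3)))
print(np.isclose(np.linalg.det(basis), 1.0))
```

Both checks print `True`: the columns are mutually orthogonal unit vectors, and the determinant is $+1$, so the orientation is preserved.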
In particular, the matrix
$$\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}$$
defines a rotation, namely the one that sends the standard basis $({\bf e}_1, {\bf e}_2, {\bf e}_3)$ to the above basis, and its inverse does the reverse.
Now, by symmetry,
$$({\bf b}, {\bf n} \times {\bf b}, {\bf n})$$
is also an oriented, orthonormal basis, and the corresponding matrix built by adjoining these (column) vectors sends the standard basis to this one.
Composing this matrix with the inverse mentioned above, the product first maps $({\bf a}, {\bf n} \times {\bf a}, {\bf n})$ to the standard basis $({\bf e}_1, {\bf e}_2, {\bf e}_3)$ and then maps the standard basis to $({\bf b}, {\bf n} \times {\bf b}, {\bf n})$; that is, by construction
$$\begin{pmatrix}{\bf b} & {\bf n} \times {\bf b} & {\bf n}\end{pmatrix}\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}^{-1}$$
is a rotation matrix that maps $\bf a$ to $\bf b$ and fixes $\bf n$ (and hence ${\bf a} \times {\bf b}$). (Since the matrix $\begin{pmatrix}{\bf a} & {\bf n} \times {\bf a} & {\bf n}\end{pmatrix}$ being inverted is orthogonal, its inverse is just its transpose, which saves considerable computation.)
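The whole construction is short in code. Here is a minimal NumPy sketch (the helper name `rotation_a_to_b` and the sample vectors are my own illustrative choices):

```python
import numpy as np

def rotation_a_to_b(a, b):
    """Rotation matrix mapping a/|a| to b/|b| and fixing the common
    normal n = (a x b)/|a x b|.  Assumes a, b are linearly independent."""
    a = np.asarray(a, dtype=float); a = a / np.linalg.norm(a)
    b = np.asarray(b, dtype=float); b = b / np.linalg.norm(b)
    n = np.cross(a, b)
    n = n / np.linalg.norm(n)
    A = np.column_stack([a, np.cross(n, a), n])  # (a, n x a, n)
    B = np.column_stack([b, np.cross(n, b), n])  # (b, n x b, n)
    return B @ A.T  # A is orthogonal, so A^{-1} = A^T

# Illustrative vectors.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 3.0, 4.0])
R = rotation_a_to_b(a, b)

print(np.allclose(R @ a, b / np.linalg.norm(b)))          # R maps a to b/|b|
print(np.allclose(R @ np.cross(a, b), np.cross(a, b)))    # R fixes a x b
```

Both checks print `True`; since $R$ fixes the unit normal $\bf n$, it fixes every scalar multiple of it, in particular ${\bf a} \times {\bf b}$ itself.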
One can of course expand this expression to find explicit formulas for the entries of the resulting matrix in terms of the components of $\bf a$ and $\bf b$, but I doubt that this would simplify nicely.