Using the two comments in the question, one can see the following two estimates:
\begin{align*}
\|A\|_2&≤\|A\|_{Fr}≤\sqrt{n}\|A\|_2
\end{align*}
The right inequality is obvious; the left one follows from the following observation.
Let $λ$ be any eigenvalue of $A$ with corresponding normalized eigenvector $w$. Then
$$|λ|=|λ|\,\|w\|=\|Aw\|≤\|A\|\,\|w\|=\|A\|$$
for any submultiplicative norm $\|\cdot\|$ that is compatible with the vector norm. Applying this bound to the Hermitian matrix $A^*A$ gives $\|A\|_2^2=λ_{\max}(A^*A)≤\|A^*A\|_{Fr}≤\|A\|_{Fr}^2$, hence $\|A\|_2≤\|A\|_{Fr}$.
For $n=1$ you get equality on the left.
And for $A=λI_n$, with $λ>0$, we get equality on the right, since
\begin{align*}
\|A\|_2 &= λ \\
\|A\|_{Fr}&=\sqrt{\sum_{i=1}^nλ^2} = \sqrt{n}λ=\sqrt{n}\|A\|_2
\end{align*}
Therefore both estimates are sharp, i.e. the constants $1$ and $\sqrt{n}$ cannot be improved.
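A quick numerical sanity check of both inequalities (a sketch using NumPy; the random matrix and the choice $A=λI_n$ are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

spec = np.linalg.norm(A, 2)      # spectral norm ||A||_2
frob = np.linalg.norm(A, 'fro')  # Frobenius norm ||A||_Fr

# ||A||_2 <= ||A||_Fr <= sqrt(n) * ||A||_2
assert spec <= frob <= np.sqrt(n) * spec

# Equality on the right for A = lambda * I:
lam = 3.0
B = lam * np.eye(n)
assert np.isclose(np.linalg.norm(B, 'fro'), np.sqrt(n) * np.linalg.norm(B, 2))
```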
Remark: Since the question was asked with the "numerical linear algebra" tag, one should keep in mind that eigenvalue problems are, in general, numerically unstable: even if the entries of the matrix are perturbed only slightly, the eigenvalues can change dramatically. Since the Frobenius norm is essentially a measure of the perturbation of the entries, this instability is relevant for the question.
We can see the effect of the perturbation with the following matrix
$$\tilde{A}=\pmatrix{0&ε \\ 1 & 0}∈ℝ^{2×2}$$
Here the eigenvalues are $0$ for $ε=0$. For $ε\neq0$ they are $λ=\pm\sqrt{ε}$. So a small error of $10^{-2}$ in one entry results in eigenvalues of modulus $10^{-1}$, that is, an amplification factor of $10$.
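This is easy to reproduce numerically (a sketch with NumPy):

```python
import numpy as np

eps = 1e-2
A = np.array([[0.0, eps],
              [1.0, 0.0]])

# Perturbation of size 1e-2 produces eigenvalues of modulus 1e-1.
eigs = np.linalg.eigvals(A)
print(np.sort(eigs))  # approximately [-0.1, 0.1]
```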
Now take the $n×n$ matrix that has $1$ on the subdiagonal and $ε$ in the top right corner:
$$\tilde{A}=\pmatrix{0&&&ε \\ 1 & 0 \\ &\ddots & \ddots \\ &&1&0}∈ℝ^{n×n}$$
Again all eigenvalues are $λ=0$ if $ε=0$. But for $ε=10^{-n}$ the eigenvalues satisfy $λ^n=ε$, so they all have modulus $10^{-1}$, and you get an error amplification factor of $10^{n-1}$.
And, as stated above, the Frobenius norm only sees the small change $10^{-n}$ in the entries, while the eigenvalues change by $10^{-1}$.
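The same experiment in the $n×n$ case (a sketch; $n=8$ is an arbitrary choice):

```python
import numpy as np

n = 8
eps = 10.0 ** (-n)

A = np.diag(np.ones(n - 1), k=-1)  # ones on the subdiagonal
A[0, -1] = eps                     # eps in the top right corner

# All eigenvalues are n-th roots of eps, so their modulus is
# eps**(1/n) = 0.1, even though the perturbation itself is only 1e-8.
eigs = np.linalg.eigvals(A)
print(np.abs(eigs))
```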
In general the following theorem (Bauer–Fike) holds:
Let $A∈ℝ^{n×n}$ be diagonalizable with eigenvector matrix $W=(w_1,…,w_n)$, and let $\tilde{A}=A+δA$ be a perturbed matrix. Then every eigenvalue $\tilde{λ}$ of $\tilde{A}$ satisfies
$$\min_{λ∈σ(A)}|λ-\tilde{λ}|≤\text{cond}_{2}(W)\,\|δA\|_2.$$
So the conditioning of the eigenvalue problem depends on the condition number of $W$.
This sounds pretty bad, but for Hermitian (or real symmetric) matrices there exists an orthonormal basis of eigenvectors, so $\text{cond}_2(W)=1$ and the eigenvalue problem is well-conditioned. For arbitrary matrices it can be arbitrarily ill-conditioned.
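To see the difference, one can compare the condition number of the eigenvector matrix for a symmetric and a non-normal matrix (a sketch; the two matrices are arbitrary examples):

```python
import numpy as np

# Symmetric case: eigh returns an orthonormal eigenvector basis,
# so cond_2(W) = 1 and the eigenvalue problem is well-conditioned.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, W = np.linalg.eigh(A)
print(np.linalg.cond(W, 2))  # 1.0 (up to rounding)

# Non-normal case: the eigenvectors are nearly parallel,
# so cond_2(V) is huge and small perturbations can move the eigenvalues a lot.
B = np.array([[1.0, 1e6],
              [0.0, 2.0]])
_, V = np.linalg.eig(B)
print(np.linalg.cond(V, 2))  # on the order of 1e6
```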