Your reasoning goes awry when you say "I believe there will be $\frac{n(n+1)}{2}$ terms" (actually even before that, as we will see). You only have $n$ terms, and you cannot take that number "outside" the sum.
The $i$th term is $i\log i$, it is not $\log i$ or anything else. At that point, there is no "simplifying and taking outside the sum" to do — you have to deal with the summands as they are.
Now, for the original question, and more on your approximation: you would have to justify it. As stated, it is quite handwavy, and it is better to make it precise.
You have
$$
\sum_{i=1}^n \ln(i!)
= \sum_{i=1}^n \sum_{j=1}^i \ln j
= \sum_{j=1}^n \sum_{i=j}^n \ln j
= \sum_{j=1}^n (n-j+1) \ln j
= (n+1)\sum_{j=1}^n \ln j - \sum_{j=1}^n j\ln j \tag{1}
$$
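As a sanity check (illustrative only, not part of the proof), identity (1) can be verified numerically, using `math.lgamma(i + 1)` for $\ln(i!)$:

```python
import math

def lhs(n):
    # sum_{i=1}^n ln(i!), computed via lgamma(i + 1) = ln(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

def rhs(n):
    # (n+1) * sum_{j=1}^n ln j  -  sum_{j=1}^n j ln j, as in (1)
    log_sum = sum(math.log(j) for j in range(1, n + 1))
    jlog_sum = sum(j * math.log(j) for j in range(1, n + 1))
    return (n + 1) * log_sum - jlog_sum

for n in (5, 50, 500):
    assert math.isclose(lhs(n), rhs(n), rel_tol=1e-9)
```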
which is not at first glance the same thing...$^{(\dagger)}$ Let's deal with each term separately.
As we know (see for instance this other question),
$
\sum\limits_{j=1}^n \ln j = n \ln n + O(n)
$, so the first term is
$$
(n+1)\sum_{j=1}^n \ln j = n^2\ln n + O(n^2). \tag{2}
$$
By a series/integral comparison (since $f$ defined by $f(x) = x\ln x$ is monotone and well-behaved; see again this other question for more details on the method if you are not familiar with it), it is not hard to show that
$$
\sum_{j=1}^n j \ln j = \int_1^n x\ln x\,dx + O(n\ln n) = \frac{1}{2}n^2 \ln n + O(n^2). \tag{3}
$$
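To see (3) in action numerically (again purely illustrative), the ratio of $\sum_{j\leq n} j\ln j$ to $\frac{1}{2}n^2\ln n$ should approach $1$ as $n$ grows, with the convergence slowed by the $O(n^2)$ term:

```python
import math

def S(n):
    # partial sum S(n) = sum_{j=1}^n j ln j
    return sum(j * math.log(j) for j in range(1, n + 1))

for n in (10**2, 10**3, 10**4):
    # ratio should creep up towards 1 as n grows
    print(n, S(n) / (0.5 * n**2 * math.log(n)))
```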
Combining (2) and (3) into (1), we get
$$
\sum_{i=1}^n \ln(i!)
= n^2\ln n - \frac{1}{2}n^2 \ln n + O(n^2) = \frac{1}{2}n^2 \ln n + O(n^2).
$$
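One can check the final estimate numerically as well (illustrative only): the ratio of $\sum_{i\leq n}\ln(i!)$ to $\frac{1}{2}n^2\ln n$ tends to $1$, slowly, since the $O(n^2)$ error is smaller than $n^2\ln n$ by only a factor of $\ln n$:

```python
import math

def T(n):
    # T(n) = sum_{i=1}^n ln(i!), via lgamma(i + 1) = ln(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

for n in (10**2, 10**3, 10**4):
    # ratio increases towards 1, at speed roughly 1 - O(1/ln n)
    print(n, T(n) / (0.5 * n**2 * math.log(n)))
```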
$(\dagger)$ Actually, your original approximation is true, because of the theorem below. However, you should cite or justify such a result before using it: if you do not know how or why an approximation holds, you are very likely to make mistakes with it sooner or later.
Theorem. Let $(a_n)_n, (b_n)_n$ be two non-negative sequences such that $a_n\sim_{n\to\infty} b_n$. Then the series $\sum\limits_n a_n$ converges iff the series $\sum\limits_n b_n$ converges; moreover:
- If the series converge, then $\sum\limits_{n=N}^\infty a_n \sim_{N\to\infty} \sum\limits_{n=N}^\infty b_n$ (the remainders are equivalent);
- If the series diverge, then $\sum\limits_{n=1}^N a_n \sim_{N\to\infty} \sum\limits_{n=1}^N b_n$ (the partial sums are equivalent).