My task is to prove that if an atomic measure space is $\sigma$-finite, then the set of atoms must be countable.
This is my given definition of an atomic measure space:
Assume $(X,\mathcal{M},\mu)$ is a measure space in which every singleton is measurable. An atom is a point $x$ with $\mu(\{x\}) > 0$. Letting $\mathcal{A}$ be the set of atoms, $(X,\mathcal{M},\mu)$ is called atomic if $\mathcal{A}\in\mathcal{M}$ and $\mu(\mathcal{A}^c) = 0$.
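For concreteness, an example I have in mind (my own, not part of the original problem) is a measure built from countably many point masses on $\mathbb{R}$:
$$\mu = \sum_{n=1}^{\infty} 2^{-n}\,\delta_{x_n}, \qquad \mu(\{x_n\}) = 2^{-n} > 0,$$
where the $x_n$ are distinct points and $\delta_x$ is the Dirac measure at $x$. Here $\mathcal{A} = \{x_n : n \ge 1\}$ and $\mu(\mathcal{A}^c) = 0$, so the space is atomic; it is finite (hence $\sigma$-finite) and has countably many atoms, consistent with the claim.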
I didn't know how to prove this at first, so I looked it up on Stack Exchange and found the following answer (I do not have enough reputation to comment on the original post):
Here's how to prove your claim, with the appropriate assumption. Let $S\subset X$ be the set of atoms for some measure $\mu$ on $X$. Let $\{U_i\}$ be a countable measurable partition of $X$. Then if $S$ is uncountable, some $U_i$ contains an uncountable subset $S'$ of $S$, and $\mu(U_i)\geq \sum_{x\in S'}\mu(\{x\})=\infty$ since any uncountable sum of positive numbers diverges. Thus $\mu$ is not $\sigma$-finite.
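As I read it (my paraphrase, not part of the quoted answer), the argument uses the decomposition
$$S = \bigcup_{i=1}^{\infty} \left( S \cap U_i \right),$$
so that if $S$ were uncountable, then, since a countable union of countable sets is countable, some $S' := S \cap U_i$ must be uncountable; the claimed inequality $\mu(U_i)\geq \sum_{x\in S'}\mu(\{x\})$ is then supposed to force $\mu(U_i)=\infty$.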
My question is: why do we have $\mu(U_i) \geq \sum_{x\in S'} \mu(\{x\})$? I assumed this inequality comes from subadditivity of $\mu$, but as I understand it, subadditivity (like countable additivity) is stated only for countable unions, not uncountable ones, so I am confused about how an uncountable sum can even appear in this step.
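My best guess (this interpretation is mine, not stated in the answer) is that the uncountable sum is meant as a supremum of finite partial sums:
$$\sum_{x\in S'} \mu(\{x\}) := \sup\left\{ \sum_{x\in F} \mu(\{x\}) \;:\; F \subseteq S' \text{ finite} \right\}.$$
Under that reading, finite additivity and monotonicity would give $\mu(U_i) \geq \mu\!\left(\bigcup_{x\in F}\{x\}\right) = \sum_{x\in F}\mu(\{x\})$ for every finite $F \subseteq S'$, and taking the supremum over $F$ would yield the inequality. Is this the intended justification, or is something else going on?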