Let $X(n)$ be a sample of size $n$ from a continuous distribution $P$ with CDF $F$ and median $m$.
The empirical distribution function of $X(n)$ is $F_n$ (non-decreasing and piecewise constant).
By the Glivenko-Cantelli theorem, $F_n \to F$ uniformly, which implies that the median of $F_n$ converges to $m$ almost surely.
Proof
The Glivenko-Cantelli theorem states that $\sup_{x \in \mathbb{R}}\left| F(x) - F_n(x)\right| \xrightarrow{a.s.} 0$. We know that $0 \leq \left| F(m) - F_n(m)\right| \leq \sup_{x \in \mathbb{R}}\left| F(x) - F_n(x)\right|$ by definition of the supremum and absolute value. Since $F$ is continuous, $F(m) = \frac12$. Therefore,
$$ 0 \leq \left| F(m) - F_n(m)\right| = \left| \frac12 - F_n(m)\right| \leq \sup_{x \in \mathbb{R}}\left| F(x) - F_n(x)\right| \xrightarrow{a.s.} 0$$
$$\implies \left| \frac12 - F_n(m)\right| \xrightarrow{a.s.} 0$$
Notice we made no distributional assumptions about $P$ except continuity.
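As a quick numerical sanity check (not part of the proof), here is a minimal simulation sketch in Python/NumPy; the Exponential(1) distribution (median $\ln 2$), the seed, and the sample sizes are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
true_median = np.log(2)  # median of Exponential(1)

# The sample median should settle down around ln(2) as n grows.
for n in [100, 10_000, 1_000_000]:
    x = rng.exponential(scale=1.0, size=n)
    print(n, np.median(x), abs(np.median(x) - true_median))
```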
Note: Need to add a little more nuance in the cases where $P$ has atoms (jumps) in its CDF.
In that case we can define the median of a discrete distribution $P$ as the value(s) where the CDF $F(x)$ intersects the line $F=\frac12$. This means that we can define $m$ as follows:
$$m(P):= \inf\left\{x : F(x)\geq \frac12\right\}$$
This definition conveniently applies to discrete, mixed, and continuous distributions alike.
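For a finite discrete distribution this definition is straightforward to compute; here is a small sketch (the function name and the example probabilities are my own, purely for illustration):

```python
import numpy as np

def dist_median(values, probs):
    """m(P) := inf{x : F(x) >= 1/2} for a discrete distribution,
    given its support points in increasing order and their probabilities."""
    cdf = np.cumsum(probs)
    return values[np.searchsorted(cdf, 0.5)]  # first x with F(x) >= 1/2

# P(X=0)=0.2, P(X=1)=0.5, P(X=2)=0.3  =>  F(0)=0.2 < 1/2 <= F(1)=0.7, so m = 1
print(dist_median(np.array([0, 1, 2]), [0.2, 0.5, 0.3]))  # 1
```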
Due to the monotonicity of $F$, we will have a unique value for $m$ except in the case that @kimchilover points out (an interval of values where $F(x)=\frac12$).
In that case, the sample median will converge in probability to a set:
Let $m_n(P)$ be the sample median of a sample of size $n$ drawn from $P$ with CDF $F$. Let $M$ be the set
$$M:=\left\{ x: F(x) \leq \frac12\right\} \cap \left\{ x:F(x) \geq \frac12 \right\}$$
Then the sequence of sample medians satisfies
$$\lim_{n \to \infty} P\left( m_n \in [\inf M-\epsilon,\ \inf M+\epsilon] \cup [\sup M-\epsilon,\ \sup M+\epsilon]\right) = 1 \quad \forall \epsilon > 0$$
In other words, the infinite sequence of $m_n(P)$ will fall in the above set all but a finite number of times. This is because the order statistics of the samples will "cluster" around each endpoint of the interval, so the sample median will often "flip flop" between the two endpoints depending on whether slightly more of the sample values fall to the left or to the right of the interval.
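To see the flip-flopping concretely, here is a rough simulation sketch using a fair coin, i.e. $P(X=0)=P(X=1)=\frac12$, so $F(x)=\frac12$ on all of $[0,1)$ and $M$ stretches from $0$ to $1$; the odd sample size and number of repetitions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fair coin: F(x) = 1/2 for 0 <= x < 1, so the median interval runs from 0 to 1.
n = 100_001  # odd, so the sample median is a single order statistic
medians = [np.median(rng.integers(0, 2, size=n)) for _ in range(20)]
print(medians)  # a mix of 0.0s and 1.0s: the sample median flip-flops between the endpoints
```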
The above generalizes the earlier definitions of median:
Note that when the median is unique (e.g., a continuous distribution whose CDF is strictly increasing at the median, or a CDF with a "jump" from below $\frac12$ to above $\frac12$, where the location of the jump defines a unique median) the above reduces to saying that the sample median converges in probability to the unique median. This is weaker than the almost sure convergence we get from Glivenko-Cantelli, but for all practical purposes it is just as good.
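As a final illustration of the jump case (the probabilities below are my own toy example): the CDF jumps from $0.3$ to $0.7$ at $x=1$, so the median $m=1$ is unique and the sample median settles there.

```python
import numpy as np

rng = np.random.default_rng(2)

# CDF jumps from 0.3 to 0.7 at x = 1, so the median m = 1 is unique.
values, probs = [0, 1, 2], [0.3, 0.4, 0.3]
for n in [101, 10_001, 1_000_001]:
    x = rng.choice(values, size=n, p=probs)
    print(n, np.median(x))  # converges to 1.0
```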