I don't understand the fetish for calculus. There should be a way without it (in fact, I personally prefer to avoid it whenever possible):
Order the data as follows: $x_1 < x_2 < \ldots < x_n$. Suppose that $x \in [x_i, x_{i+1}]$. Then we have $$\|(x_1, \ldots, x_n) - (x, x, \ldots, x)\|_1 = \sum_{j \leq i}(x - x_j) + \sum_{j > i}(x_j - x),$$
which is a linear function (when restricted to the interval $[x_i, x_{i+1}]$) with slope equal to the number of $j$ with $j \leq i$ minus the number of $j$ with $j > i$, i.e. $i - (n - i) = 2i - n$. Thus the slope is negative if $x$ is less than a median, positive if $x$ is greater than a median, and zero precisely when there are as many data points to the left as to the right. In other words, the function decreases as we approach a median from below, stays constant on the set of medians, and then increases afterwards. So the minima occur exactly at the medians.
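For a concrete example, take the data $1, 2, 10$. On $[1, 2]$ the sum is $(x - 1) + (2 - x) + (10 - x) = 11 - x$, with slope $1 - 2 = -1$; on $[2, 10]$ it is $(x - 1) + (x - 2) + (10 - x) = x + 7$, with slope $2 - 1 = +1$. The function decreases, then increases, so the minimum is at $x = 2$, the median.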
Note that this is a piecewise linear function (which piece depends on the interval containing $x$), so it is not differentiable at the data points and calculus does not apply directly.
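If you want a quick numerical sanity check, here is a minimal sketch in Python (the data and grid resolution are arbitrary choices, not part of the argument): it evaluates $f(x) = \sum_j |x_j - x|$ on a fine grid and compares the minimizer with the median.

```python
import numpy as np

# Arbitrary example data, sorted as in the argument above.
rng = np.random.default_rng(0)
data = np.sort(rng.integers(0, 100, size=7)).astype(float)

# Evaluate f(x) = sum_j |x_j - x| on a fine grid covering the data.
grid = np.linspace(data.min() - 1, data.max() + 1, 10001)
f = np.abs(data[None, :] - grid[:, None]).sum(axis=1)

x_star = grid[np.argmin(f)]
print("data:   ", data)
print("median: ", np.median(data))
print("argmin: ", x_star)  # agrees with the median, up to grid spacing
```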