The problem isn't with the calculus. It's that you're subtly asking it to divide by zero.
Here's why. Remember that differential calculus is just a method for calculating the gradient of a curve at every point. Recall that, for a straight line, the answer is easy. The gradient is simply: $$\frac{\text{change in vertical height}}{\text{change in horizontal distance}}$$
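The rise-over-run rule can be sketched numerically. This is a minimal illustration, not anything from the question itself; the helper name `gradient` is hypothetical.

```python
# Gradient of a straight line as rise over run.
# `gradient` is an illustrative helper name, not from the text.
def gradient(f, a, b):
    """Change in vertical height over change in horizontal distance."""
    return (f(b) - f(a)) / (b - a)

line = lambda t: 2 * t + 1   # a straight line with slope 2

# For a straight line, the answer is the same for any pair of points:
print(gradient(line, 0.0, 5.0))    # 2.0
print(gradient(line, -3.0, 7.0))   # 2.0
```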
Now suppose that $x$ represents a constant. Then there is no change in $x$. So the gradient must be equal to: $$\frac{\text{change in }x}{\text{change in horizontal distance}}=\frac{0}{\text{change in horizontal distance}} = 0$$
So far so good. But what happens when we try to take the derivative of $x$ with respect to itself? That's the same as asking how $x$ changes as $x$ changes, which is like labeling our graph so that both the vertical and horizontal axes represent $x$. So now it is not only the change in vertical height that equals 0, but also the change in horizontal distance. Hence the gradient must equal $\frac{0}{0}$, which is undefined.
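A quick sketch of that situation, with an illustrative constant value not taken from the question: when both axes represent the same constant $x$, rise and run are both 0, and the division simply cannot be carried out. (Python's float division raises `ZeroDivisionError` rather than producing a value.)

```python
# Both axes represent the constant x, so rise and run are both zero.
x = 3.0                       # an illustrative constant value
rise = x - x                  # change in vertical height: 0.0
run = x - x                   # change in horizontal distance: also 0.0

try:
    slope = rise / run        # attempting 0.0 / 0.0
except ZeroDivisionError:
    print("undefined")        # Python refuses: 0/0 has no value
```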
More generally, the derivative of $x$ with respect to any constant $c$ is undefined. Since the constant does not change, the change in $c$ equals 0. Hence the gradient of the tangent at each point of the curve equals: $$\frac{\text{change in } x}{\text{change in } c} = \frac{\text{change in } x}{0} = \text{undefined}$$ Because this is true of the tangent at each and every point of the curve, we can intuitively see how $\frac{dx}{dc}$ as a whole must be undefined.
The take-home rule is that though you can take the derivative of a constant, you cannot take the derivative with respect to a constant. As a corollary, you cannot take the derivative of a constant with respect to itself, which explains the problem case in the question above.
Thinking just in terms of linear gradients, without worrying about infinitesimal limits and such, you can see in intuitive terms how you get the mistaken equality 1 = 0.
Let's again imagine we are dealing with straight lines, so that we can let $\delta x$ represent the change in $x$ without having to imagine that we have taken anything to the limit. Then the mathematics that gets us the left hand side of the mistaken equality corresponds to the intuitive rule that, since the numerator and denominator of $\frac{\delta x}{\delta x}$ are the same, $\frac{\delta x}{\delta x}$ must equal 1.
The mathematics that gets us the right hand side of the mistaken equality corresponds to the intuitive rule that, since $x$ is a constant, $\delta x = 0$, hence the numerator of $\frac{\delta x}{\delta x}$ is 0 and so $\frac{\delta x}{\delta x}$ must itself equal 0.
The mistake, in both cases, is in failing to notice that when $x$ is constant $\frac{\delta x}{\delta x} = \frac{0}{0}$, which is an exception to both of these rules.
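The two intuitive rules, and the way they collide at $\frac{0}{0}$, can be made concrete by encoding each rule naively. The function names `rule_same` and `rule_zero_num` are illustrative labels, not anything from the text.

```python
# The two intuitive rules from the argument, encoded naively.
def rule_same(num, den):
    # "numerator and denominator are the same, so the ratio is 1"
    return 1 if num == den else None

def rule_zero_num(num, den):
    # "numerator is 0, so the ratio is 0"
    return 0 if num == 0 else None

dx = 0.0  # delta-x when x is a constant

print(rule_same(dx, dx))      # 1
print(rule_zero_num(dx, dx))  # 0

# Both rules fire on 0/0 and give contradictory answers -- the case
# 0/0 is an exception to each rule: it is undefined, not 1 and not 0.
```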