Interestingly enough, the answer is "both, kinda".
When Leibniz invented the notation, he had in mind that $dx$ represented a change in $x$ so small that it was smaller in magnitude than any positive real number. Such a quantity is called an infinitesimal. In his conception, $a\cdot dx$ was also infinitesimal for any real number $a$. In fact, there are only two things you can do with infinitesimals to get real numbers back out of them: you can take the ratio of two of them (like $\frac{dy}{dx}$), or you can add up infinitely many of them (like $\int x\, dx$). In his day, doing algebra with differentials was perfectly acceptable.
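To give a feel for that style of reasoning (a sketch in the Leibniz spirit, not a rigorous argument): if $y = x^2$, then
$$dy = (x+dx)^2 - x^2 = 2x\,dx + (dx)^2.$$
Since $(dx)^2$ is infinitesimal even compared to $dx$, it gets discarded, and dividing through by $dx$ gives $\frac{dy}{dx} = 2x$.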
Flash forward to the nineteenth century, when the real numbers were being formally defined and calculus was becoming real analysis. It was decided at that point that whether or not you believed in the existence of infinitesimals shouldn't affect your ability to do calculus, so all of the definitions were rebuilt on limits and rigorously proved. But we were used to Leibniz's notation, so we kept it. The downside is that we can no longer treat differentials as algebraic terms, and if we want to use something like the Chain Rule, we have to prove it from the limit-based definitions instead of simply cancelling out differentials. In that sense, you are right that $\frac{d}{dx}$ is now an operator.
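Concretely, in the limit-based framework the derivative is defined as
$$\frac{dy}{dx} = f'(x) = \lim_{h\to 0}\frac{f(x+h)-f(x)}{h},$$
and the Chain Rule is the theorem $(f\circ g)'(x) = f'(g(x))\,g'(x)$, which in Leibniz notation looks exactly like a cancellation:
$$\frac{dy}{dx} = \frac{dy}{du}\cdot\frac{du}{dx}.$$
The cancellation picture is a fine mnemonic, but the actual proof has to deal with cases the notation glosses over, for instance when $du = g(x+h)-g(x)$ is zero for $h$ arbitrarily close to $0$.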
That being said, there are a lot of people who "hack" calculus by falling back on Leibniz's intuitions, and to be honest I don't know of any situation in single-variable calculus where it actually gets you into trouble. Note, though, that as the links in my comments suggest, partial differentials don't behave as well.
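A standard example of how the fraction intuition breaks for partials (my own illustration, not necessarily the one in the linked comments): if $F(x,y,z)=0$ implicitly defines each variable in terms of the other two, then
$$\left(\frac{\partial x}{\partial y}\right)_{\!z}\left(\frac{\partial y}{\partial z}\right)_{\!x}\left(\frac{\partial z}{\partial x}\right)_{\!y} = -1,$$
whereas naively cancelling "numerators" against "denominators" would predict $+1$. You can check this with something as simple as $x+y+z=0$, where each of the three factors equals $-1$.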