A calculus textbook I have (Stewart's Calculus) states that for a differentiable function $y=f(x)$, the differential of the function is defined as $$dy=f'(x)\,dx.$$
It states that $\Delta x\approx dx$ since $\Delta x$ is small. What I fail to understand is why $dx$ is classified as a differential, and why $\Delta x$ is replaced with $dx$ in the first place when defining $dy$. The book states that $dy$ represents the change in the linearization of the function, and defines $$\Delta y=f(x+\Delta x)-f(x)$$ as the change in the value of the function, yet it does not say why $\Delta x$ was replaced with $dx$ when defining $dy$. My initial thought was that $dx$ must be an infinitesimal, yet the highly voted answer to this question: Is $\frac{\textrm{d}y}{\textrm{d}x}$ not a ratio? states that infinitesimals are not rigorously defined objects in standard analysis.

Basically, I am confused as to why $\Delta x$ and $dx$ are treated as different concepts, and stating that $dx=\Delta x$ simply because $\Delta x$ is small does not seem mathematically rigorous to me.
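To make my confusion concrete, here is a small worked example using the book's definitions (the particular function and numbers are my own, not taken from Stewart). Take $f(x)=x^2$ at $x=1$ with $\Delta x = dx = 0.1$. Then $$\Delta y = f(1.1)-f(1) = 1.21-1 = 0.21, \qquad dy = f'(1)\,dx = 2(0.1) = 0.2,$$ so $dy$ and $\Delta y$ are close but not equal, and I do not see what $dx$ is supposed to be here other than another name for $\Delta x$.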