I have been reading on this website for a long time today, looking at similar questions and the answers provided about the issue I am asking now, but I have to say I am more baffled than I was originally.
My original question would be:
I have read in many textbooks, when they teach integration, an emphasis on not treating $dx$ as a quantity but only as a notation that indicates the variable of integration. After a while, the same textbooks start talking about differentials and, based on the following expressions,
$$ df = f'(x)dx\quad df = f_{x}dx+f_{y}dy $$
they say that, now that differentials have been introduced, we can treat them as independent variables and use them freely as fractions.
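To make concrete what I mean by "use them freely as fractions" (this is my own illustration, not taken from any particular textbook), the kind of manipulation I have in mind is the usual chain-rule-style cancellation, or the separation step in a differential equation:
$$ \frac{dy}{dx} = \frac{dy}{du}\cdot\frac{du}{dx}, \qquad \frac{dy}{dx} = g(x)\,h(y) \;\Longrightarrow\; \frac{dy}{h(y)} = g(x)\,dx. $$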
I had always worked with that idea, but after some digging I started seeing a lot of different answers, especially here, and most of them were conflicting.
Some people mentioned Non-Standard Analysis and Differential Forms and tried to justify treating differentials as fractions. Others said that every time we treat them as fractions we are actually applying other theorems, so we are not really treating them that way.
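To give an example of that second claim (again my own illustration): in $u$-substitution one writes $u = g(x)$ and $du = g'(x)\,dx$ as if multiplying through by $dx$, but the step is usually justified by the substitution rule, i.e. the chain rule read backwards, not by the fraction manipulation itself:
$$ u = g(x),\quad du = g'(x)\,dx \;\Longrightarrow\; \int f(g(x))\,g'(x)\,dx = \int f(u)\,du. $$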
So the questions I have to ask now are:
Can we actually treat differentials as fractions in EVERY framework, or is this still debated today?
I am not asking for personal opinions or for what we can get away with in a specific framework. I saw a lot of people saying that in single-variable calculus you may treat them like that because you won't make mistakes. Can you actually treat them like that, strictly mathematically, in EVERY framework, or can you only do it for convenience while different theorems are actually applied behind the scenes?
If differentials cannot be treated as fractions in EVERY framework, then is it defined and proven that they can always be treated as fractions in some frameworks?
Given that we have defined differentials as
$$ df = f'(x)dx\quad df = f_{x}dx+f_{y}dy $$
then haven't we automatically defined them as quantities that can be used independently, so that they can also be seen as fractions? Shouldn't we accept them as fractions from the moment that definition exists?
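To spell out the manipulation I am asking about (my own formal rewriting of the two definitions above): "dividing" the single-variable definition by $dx$ gives back the derivative, and "dividing" the two-variable definition by $dx$, when $y$ depends on $x$, gives the familiar total-derivative formula:
$$ \frac{df}{dx} = f'(x), \qquad \frac{df}{dx} = f_{x} + f_{y}\,\frac{dy}{dx}. $$
Is that division a legitimate operation in its own right, or only a mnemonic for the theorems behind it?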