
Suppose I have $f(x) = 5x$.

I know that $\frac{d\ f(x)}{dx} = 5$.

But what is $\frac{d\,f(x)}{d5}$, the derivative of the function $f$ with respect to the constant $5$?

The reason I ask is that I'm implementing software that computes auto-differentiation (a la TensorFlow). I want to know if I can treat a constant like a variable (as above) or if I have to do something else. This Stanford deep learning class webpage is what I'm referring to:

$$ f_c(x) = c+x \\ f_a(x) = ax $$ where the functions $f_c$, $f_a$ translate the input by a constant $c$ and scale the input by a constant $a$, respectively. These are technically special cases of addition and multiplication, but we introduce them as (new) unary gates here since we do not need the gradients for the constants $c$, $a$.

That above statement implies that you could compute the derivative w.r.t. a constant, but they chose not to.

This post did not answer my question: derivative with respect to constant.

Thanks.

4 Answers


The derivative of a constant with respect to a variable is $0$, but the derivative of a function with respect to a constant, as Fra mentioned in the comments, is ill defined.

EDIT

The question has been updated. The link provided in the question discusses functions

$$f_c(x) = c + x $$ and $$f_a(x) = ax$$

The link also indicates their derivatives are:

$$\frac{df_c}{dx}=1$$ and $$\frac{df_a}{dx}=a,$$ respectively, as expected.

These derivatives are still taken with respect to $x$, not with respect to the constants $c$ or $a$. The confusion may have arisen because the letter $c$ in $f_c(x)$ gives the impression that the function is being differentiated with respect to $c$, which is not the case; the same applies to $f_a(x)$.
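As a quick sanity check, here is a sketch using plain Python finite differences (no autodiff library, and `numerical_derivative` is just a helper made up for this example) confirming the two derivatives stated above:

```python
# Central finite difference: (f(x+h) - f(x-h)) / (2h) approximates df/dx.
def numerical_derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

c, a = 3.0, 5.0
f_c = lambda x: c + x     # translate-by-constant gate
f_a = lambda x: a * x     # scale-by-constant gate

print(round(numerical_derivative(f_c, 2.0), 6))  # ≈ 1.0
print(round(numerical_derivative(f_a, 2.0), 6))  # ≈ a = 5.0
```

Both results match $\frac{df_c}{dx}=1$ and $\frac{df_a}{dx}=a$; the constants $c$ and $a$ are simply baked into the functions.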

Josh
  • $\frac{d\ 5x}{d5} = x$, right? – stackoverflowuser2010 Apr 19 '21 at 18:34
  • @stackoverflowuser2010 no. I believe you are treating this as a fraction, but a derivative should not be treated as a fraction. $5$ is a constant, so $\frac{d (5x)}{d5}$ is not defined. You can talk about $\frac{d (5x)}{dx}$, which is equal to $5$. Let me know if you have further questions. – Josh Apr 19 '21 at 21:41

It helps to know about the derivation operator $D_t$, which satisfies $$ D_t[u+v] = D_t[u]+D_t[v],\quad D_t[u\,v] = D_t[u]\,v + u\,D_t[v].\tag{1} $$ For example, use equation $(1)$ to get $$ D_t[m\,x+b] = D_t[m]\,x + m\,D_t[x] + D_t[b].\tag{2} $$ You asked

I want to know if I can treat a constant like a variable

In the context of symbolic differentiation the answer is yes with $\,D_t,\,$ and if you later want to assume that some symbol $\,c\,$ is a constant, you just set $\,D_t[c] = 0.\,$

Mathematica has a function Dt which implements the total differential.
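For illustration, here is a minimal Python sketch of such an operator on toy expression trees (all names here are invented for the example, not taken from Mathematica or any library). Every symbol gets an assumed derivative; declaring a symbol constant means assigning it derivative $0$, exactly as described above:

```python
# Expressions are symbols (strings) or nested tuples ('+'|'*', u, v).
# 'derivs' maps each symbol to its assumed derivative D_t[symbol];
# set a symbol's entry to 0 to declare it a constant.
def Dt(expr, derivs):
    if isinstance(expr, str):          # a symbol: look up D_t[symbol]
        return derivs[expr]
    op, u, v = expr
    if op == '+':                      # D_t[u+v] = D_t[u] + D_t[v]
        return ('+', Dt(u, derivs), Dt(v, derivs))
    if op == '*':                      # product rule (1)
        return ('+', ('*', Dt(u, derivs), v), ('*', u, Dt(v, derivs)))
    raise ValueError(op)

def evaluate(expr, env):
    if isinstance(expr, str):
        return env[expr]
    if isinstance(expr, (int, float)):
        return expr
    op, u, v = expr
    a, b = evaluate(u, env), evaluate(v, env)
    return a + b if op == '+' else a * b

# f = m*x + b as in equation (2). Differentiate with respect to x,
# treating m and b as constants: D_t[m] = 0, D_t[x] = 1, D_t[b] = 0.
f = ('+', ('*', 'm', 'x'), 'b')
df_dx = Dt(f, {'m': 0, 'x': 1, 'b': 0})
print(evaluate(df_dx, {'m': 3, 'x': 2, 'b': 7}))  # df/dx = m = 3
```

Changing the assumptions to `{'m': 1, 'x': 0, 'b': 0}` differentiates the same expression with respect to $m$ instead, yielding $x$: the symbols themselves carry no notion of "constant" until you assign their derivatives.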

Somos

A derivative is defined by a limit \begin{equation} \frac{df}{dx}(x_0)=\lim_{x\rightarrow x_0}\frac{f(x)-f(x_0)}{x-x_0} \end{equation} which measures how $f$ changes as $x$ changes. If $x$ is a constant, then $x$ never changes, so $f$ never changes with it, and the notion makes no sense. Put another way (mathematically), a constant $x$ can never differ from $x_0$, so the denominator is identically $0$ and the limit does not exist.

Don P.

In the context of computation graphs used for auto-differentiation (like in TensorFlow) for neural network back-propagation, constants (e.g. 5) can be treated like an input variable.

In the case of the original example,

$$ f(x) = 5x $$ Then $$ \frac{d f(x)}{d 5} = x $$

Note that in practice, while $\frac{d f(x)}{d 5}$ is computable in this setting, you would never need the result, because the point of auto-differentiation for back-propagation is to compute gradients with respect to the weight variables being updated.
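To make that concrete, here is a minimal reverse-mode sketch (a toy, not TensorFlow's actual implementation; `Node`, `mul`, and `backward` are names invented for this example) in which the constant 5 is just another leaf node, so it receives a gradient like any input:

```python
# Each node stores its value, its parents with local gradients, and
# an accumulated gradient. This toy assumes each node feeds into at
# most one consumer, which is enough for f(x) = 5x.
class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent, local_gradient)
        self.grad = 0.0

def mul(a, b):
    # d(ab)/da = b, d(ab)/db = a
    return Node(a.value * b.value, [(a, b.value), (b, a.value)])

def backward(out):
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += local * node.grad   # chain rule
            stack.append(parent)

x = Node(2.0)       # input variable
five = Node(5.0)    # the constant, treated as just another leaf
f = mul(five, x)    # f(x) = 5x

backward(f)
print(x.grad)     # df/dx   = 5.0
print(five.grad)  # "df/d5" = x = 2.0
```

The graph happily produces a gradient for the constant leaf; frameworks simply skip that computation (as the Stanford notes do) because no update ever consumes it.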