
Let $$ f(x_1,x_2) = \frac{1}{\sqrt{x_1^2+x_2^2+1}} \left(x_1,x_2,1 \right) $$

For $i = 1,2$ we have

$$ \partial_{x_i} f = \frac{1}{\sqrt{x_1^2+x_2^2+1}}\left(\partial_{x_i}x_1,\partial_{x_i}x_2,0 \right) +\frac{-x_i}{\left(x_1^2+x_2^2+1\right)^{3/2}}(x_1,x_2,1) = \frac{1}{\left(x_1^2+x_2^2+1\right)^{3/2}} \left[\left(x_1^2+x_2^2+1\right)\left(\partial_{x_i}x_1,\partial_{x_i}x_2,0\right) - x_i(x_1,x_2,1) \right] $$
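For example, setting $i = 1$ (so that $\left(\partial_{x_1}x_1,\partial_{x_1}x_2,0\right) = (1,0,0)$) the formula gives

$$ \partial_{x_1} f = \frac{1}{\left(x_1^2+x_2^2+1\right)^{3/2}} \left[\left(x_1^2+x_2^2+1\right)(1,0,0) - x_1(x_1,x_2,1)\right] = \frac{1}{\left(x_1^2+x_2^2+1\right)^{3/2}} \left(x_2^2+1,\,-x_1x_2,\,-x_1\right) $$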

After some computation, I get the following matrix for the gradient of this function (rows indexed by $x_1, x_2$):

$$ \nabla f = \frac{1}{\left(x_1^2+x_2^2+1\right)^{3/2}} \begin{pmatrix} x_2^2+1 & - x_1x_2 & -x_1 \\ -x_1x_2 & x_1^2+1 & -x_2 \end{pmatrix} $$
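As a sanity check (not part of the original question), the claimed matrix can be compared against central finite differences; the test point $(0.7, -1.3)$ is an arbitrary choice:

```python
# Numerical sanity check of the claimed 2x3 matrix of partial derivatives.
# Row i of the matrix should approximate d f / d x_i at the test point.
import numpy as np

def f(x1, x2):
    """f(x1, x2) = (x1, x2, 1) / sqrt(x1^2 + x2^2 + 1)."""
    r = np.sqrt(x1**2 + x2**2 + 1.0)
    return np.array([x1, x2, 1.0]) / r

def claimed_gradient(x1, x2):
    """The 2x3 matrix from the question."""
    r3 = (x1**2 + x2**2 + 1.0) ** 1.5
    return np.array([[x2**2 + 1.0, -x1 * x2,    -x1],
                     [-x1 * x2,    x1**2 + 1.0, -x2]]) / r3

def numeric_gradient(x1, x2, h=1e-6):
    """Central differences: row i approximates the partial in x_i."""
    return np.array([
        (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h),
        (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h),
    ])

x1, x2 = 0.7, -1.3  # arbitrary test point
err = np.max(np.abs(claimed_gradient(x1, x2) - numeric_gradient(x1, x2)))
print(err)  # should be tiny (finite-difference error only)
```

The two matrices agree to within finite-difference error, which supports the computation (up to the transpose convention discussed in the answer below).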

Is it correct?

user8469759

1 Answer


The computation of the partial derivatives is correct. But the definition of the gradient matrix that I know is the transpose of the matrix you wrote.

yemino
  • I think that depends on how you treat vectors, right? (As rows or columns) – user8469759 Feb 28 '18 at 15:55
  • In the definition that I know, the gradient of a vector function $f$ is a matrix whose rows are the gradients of the components of $f$, so the gradient is the same matrix whether $f$ is a row or a column vector. But you should check the definition that you are using. – yemino Feb 28 '18 at 16:02
  • Actually I'm not using any definition. I'll check later my analysis book. – user8469759 Feb 28 '18 at 16:04
  • @yemino I have sometimes seen multivariable calculus written as $xA$ instead of $Ax$, for some Jacobian matrix $A$. It is rare (as rare as calling the Jacobian matrix the "gradient" :p) but it exists. I'm the same guy from the rincon. – Masacroso Feb 28 '18 at 16:16