When a complex function is analytic, it satisfies the Cauchy-Riemann equations, so its derivative is $f'(z_{0})=u_{x}(x_{0},y_{0})+iv_{x}(x_{0},y_{0})$. But if we consider a vector-valued function $g:\mathbb{R}^{2}\rightarrow\mathbb{R}^{2}$, the derivative at $z_{0}=(x_{0},y_{0})$ is $g'(z_{0})$, which is a Jacobian matrix. What makes these two things different? Is it because the binary operation of multiplication means different things in the complex plane and in $\mathbb{R}^2$?
-
The difference : $\mathbb R^2\neq \mathbb C$. – Surb Jul 03 '20 at 05:44
-
The requirement that $f$ is complex differentiable is much stronger than $f$ (as a function on $\mathbb{R}^2$) being real differentiable. – copper.hat Jul 03 '20 at 05:57
-
You might be interested in Visual Complex Analysis by Needham. The book devotes a chapter to the meaning of the complex derivative, which Needham calls the "amplitwist". – awkward Jul 03 '20 at 15:06
-
See related https://math.stackexchange.com/a/2990869/72031 – Paramanand Singh Jul 05 '20 at 02:27
2 Answers
When considered as a function from $\Bbb{R}^2$ to $\Bbb{R}^2$, a complex function differentiable at a point $z = x + iy$ must be differentiable at $(x, y)$, so in this sense complex differentiability is stronger than $\Bbb{R}^2$-differentiability. Further, it's strictly stronger, as the Jacobian has to be of the form $$\begin{pmatrix} a & -b \\ b & a \end{pmatrix}.$$ In fact, this matrix corresponds to the complex derivative $a + ib$: it is the matrix of the linear transformation $z \mapsto (a + ib)z$, considered as a map from $\Bbb{C}$ to $\Bbb{C}$ over the ground field $\Bbb{R}$, with respect to the basis $(1, i)$.
One consequence of this is that complex differentiable functions are conformal wherever the derivative is nonzero: when $(a, b) \neq (0, 0)$, the above matrix is the product of the scaling matrix $\sqrt{a^2 + b^2}\, I$ and a rotation matrix. This means that the map locally preserves angles, making it conformal. This is not true for general differentiable functions from $\Bbb{R}^2$ to $\Bbb{R}^2$!
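The constraint on the Jacobian can be checked numerically. The sketch below (my addition, not part of the original answer) takes $f(z) = z^2$ as an example, computes the finite-difference Jacobian of the corresponding map $\Bbb{R}^2 \to \Bbb{R}^2$, and verifies that it matches $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ with $a + ib = f'(z_0) = 2z_0$:

```python
import numpy as np

def numerical_jacobian(g, p, h=1e-6):
    """Central-difference Jacobian of g: R^2 -> R^2 at the point p."""
    J = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = h
        J[:, j] = (g(p + e) - g(p - e)) / (2 * h)
    return J

def g(p):
    """f(z) = z^2, viewed as a map R^2 -> R^2."""
    w = complex(p[0], p[1]) ** 2
    return np.array([w.real, w.imag])

z0 = complex(0.7, -1.2)
J = numerical_jacobian(g, np.array([z0.real, z0.imag]))

fprime = 2 * z0                      # analytic derivative of z^2
a, b = fprime.real, fprime.imag

# the Jacobian has the special form [[a, -b], [b, a]]
assert np.allclose(J, np.array([[a, -b], [b, a]]), atol=1e-5)
```

A generic smooth map $\Bbb{R}^2 \to \Bbb{R}^2$ (say $(x, y) \mapsto (x^2, y)$) fails this check, which is exactly the extra constraint the Cauchy-Riemann equations impose.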

-
Thanks a lot. So that means the C-R equations are related to a rotation matrix? – David Jul 03 '20 at 08:06
-
Basically, yes. It has something to do with a scaling/rotation matrix combination. The Cauchy-Riemann equations are necessary and sufficient conditions for the Jacobian to take the above form (under the assumption that the Jacobian exists). Matrices of this form are precisely the products of scaling matrices and rotation matrices. – user804886 Jul 03 '20 at 08:11
-
This is essentially the perfect answer already, but I'd like to add: The $\mathbb R^2$-derivative is a linear map $\mathbb C\to\mathbb C$ where $\mathbb C$ is considered an $\mathbb R$ vector space. The complex derivative is a linear map $\mathbb C\to\mathbb C$, where $\mathbb C$ is considered a $\mathbb C$ vector space. And an $\mathbb R$-linear map is also $\mathbb C$-linear iff its matrix representation looks like the one above. – Vercassivelaunos Jul 03 '20 at 09:30
If we let $\phi:\mathbb{C} \to \mathbb{R}^2$ be $\phi(x+iy) = (x,y)^T$ then $g(x) = \phi(f(x_1+ix_2))$.
Then ${\partial g(x) \over \partial x_1} = \lim_{t \to 0} \phi({f(x_1+i x_2 +t)- f(x_1+ix_2) \over t} ) = \phi(f'(x_1+i x_2))$ and ${\partial g(x) \over \partial x_2} = \lim_{t \to 0} \phi({f(x_1+i x_2 +it)- f(x_1+ix_2) \over t} ) = \phi(if'(x_1+i x_2))$.
In particular, if we write ${\partial g(x) \over \partial x_1} = \begin{bmatrix} a & b \end{bmatrix}$ then we must have ${\partial g(x) \over \partial x_2} = \begin{bmatrix} -b & a \end{bmatrix}$.

-
Thanks, the function $f$ you write here means $f :\mathbb{C} \to \mathbb{C}$, right? Then it is obvious that $\phi^\prime$ represents the Jacobian matrix. – David Jul 06 '20 at 16:34
-
@TornadoDavid: Yes, with $f,g$ I just followed the notation in your question. I don't understand what you mean by $\phi'$, though. The map $\phi$ is just the isomorphism between $\mathbb{C}$ and $\mathbb{R}^2$; it is not (complex) differentiable. – copper.hat Jul 06 '20 at 16:45
-
The point of the above is related to the non-differentiability of $\phi$. The idea was to highlight the constraint that $\mathbb{C}$-differentiability adds on top of $\mathbb{R}^2$-differentiability. – copper.hat Jul 06 '20 at 16:48
-
Oh, I was wrong, it is $g^\prime$, which is the derivative matrix. It is $2\times 2$. – David Jul 06 '20 at 16:50
-
Yeah, the idea was to show that given ${\partial g(x) \over \partial x_1} $ then you automatically know ${\partial g(x) \over \partial x_2} $. – copper.hat Jul 06 '20 at 16:57