
Complex Differentiation


The notion of the complex derivative is the basis of complex function theory. The definition of complex derivative is similar to the derivative of a real function. However, despite a superficial similarity, complex differentiation is a deeply different theory.

A complex function $f(z)$ is differentiable at a point $z_0\in \mathbb C$ if the following limit of the difference quotient exists:

\begin{eqnarray}\label{diff01} f'(z_0) = \lim_{z \rightarrow z_0} \frac{f(z)-f(z_0)}{z-z_0}. \end{eqnarray}

Alternatively, letting $\Delta z = z-z_0,$ we can write

\begin{eqnarray}\label{diff02} f'(z_0) = \lim_{\Delta z \rightarrow 0} \frac{f(z_0+\Delta z)-f(z_0)}{\Delta z}. \end{eqnarray}

We often drop the subscript on $z_0$ and introduce the number \[\Delta w = f(z+\Delta z)-f(z),\] which denotes the change in the value $w=f(z)$ corresponding to a change $\Delta z$ in the point at which $f$ is evaluated. Then we can write equation (\ref{diff02}) as \[\frac{d w}{d z}= \lim_{\Delta z \rightarrow 0}\frac{\Delta w}{\Delta z}.\]

Despite the fact that formula (\ref{diff01}) for the derivative is identical in form to that of the derivative of a real-valued function, a significant point to note is that $f'(z_0)$ is defined by a two-dimensional limit. Thus for $f'(z_0)$ to exist, the relevant limit must exist independently of the direction from which $z$ approaches the limit point $z_0.$ For a function of one real variable we have only two directions, that is, $x < x_0$ and $x > x_0.$
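This direction dependence can be seen numerically. The following sketch (the helper names are illustrative, not from the text) evaluates difference quotients of $f(z)=\bar z$, the standard example of a nowhere-differentiable complex function, along two different directions:

```python
# Difference quotients of f(z) = conj(z) along two directions at z0.
# Because the quotients disagree, the two-dimensional limit cannot exist.
def diff_quotient(f, z0, direction, h=1e-6):
    dz = h * direction
    return (f(z0 + dz) - f(z0)) / dz

conj = lambda z: z.conjugate()
z0 = 1 + 1j
horizontal = diff_quotient(conj, z0, 1)    # approach along the real axis
vertical = diff_quotient(conj, z0, 1j)     # approach along the imaginary axis
print(horizontal, vertical)                # 1 and -1: the limit fails to exist
```

Along the real axis the quotient is $\overline{h}/h = 1,$ while along the imaginary axis it is $\overline{ih}/(ih) = -1$, so $\bar z$ has no complex derivative anywhere.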


Figure 1: There are infinitely many directions from which $z$ can approach $z_0.$

A remarkable feature of complex differentiation is that the existence of one complex derivative automatically implies the existence of infinitely many! This is in contrast to the case of a function of a real variable $g(x),$ for which $g'(x)$ can exist without $g''(x)$ existing.


Cauchy-Riemann equations

Now let's see a remarkable consequence of definition (\ref{diff01}). First we will see what happens when we approach $z_0$ along the two simplest directions: horizontal and vertical. If we set $$z= z_0 + h = (x_0+h)+iy_0,\quad h\in \mathbb R,$$ then $z \rightarrow z_0$ along a horizontal line as $h\rightarrow 0.$ If we write $f$ in terms of its real and imaginary components, that is $$f(z) = u(x,y)+iv(x,y),$$ then $$f'(z_0)= \lim_{h \rightarrow 0}\frac{f(z_0+h)-f(z_0)}{h}.$$ Thus

\begin{eqnarray*} f'(z_0)&= & \lim_{h \rightarrow 0} \frac{f(z_0+h)-f(z_0)}{h} = \lim_{h \rightarrow 0} \frac{f(x_0+h + iy_0)-f(x_0+iy_0)}{h} \\ &= & \lim_{h \rightarrow 0} \left[ \frac{u \left( x_0 +h, y_0 \right) - u \left( x_0 , y_0 \right)}{h}\right]+i \lim_{h \rightarrow 0} \left[ \frac{v \left( x_0 +h, y_0 \right) - v \left( x_0 , y_0 \right)}{h}\right] \\ &= & u_x(x_0, y_0)+ i v_x(x_0,y_0) \end{eqnarray*}
where $u_x(x_0, y_0)$ and $v_x(x_0,y_0)$ denote the first-order partial derivatives with respect to $x$ of the functions $u$ and $v,$ respectively, at $(x_0, y_0).$ If now we set $$z = z_0+ik = x_0 + i(y_0+k), \quad k\in \mathbb R,$$ then $z\rightarrow z_0$ along a vertical line as $k\rightarrow 0.$ Therefore, we also have
\begin{eqnarray*} f'(z_0)&= & \lim_{k \rightarrow 0} \frac{f(z_0+ik)-f(z_0)}{ik} = \lim_{k \rightarrow 0} \left[ -i \frac{f(x_0 + i(y_0+k))-f(x_0+iy_0)}{k} \right] \\ &= & \lim_{k \rightarrow 0} \left[ \frac{v \left( x_0 , y_0 + k\right) - v \left( x_0 , y_0 \right)}{k}-i \frac{u \left( x_0 , y_0 +k \right) - u \left( x_0 , y_0 \right)}{k}\right] \\ &= & v_y(x_0, y_0)- i u_y(x_0,y_0) \end{eqnarray*}
where the partial derivatives of $u$ and $v$ are, this time, with respect to $y.$ By equating the real and imaginary parts of these two formulae for the complex derivative $f'(z_0),$ we notice that the real and imaginary components of $f(z)$ must satisfy a homogeneous linear system of partial differential equations: $$u_x=v_y, \quad u_y=-v_x.$$ These are the Cauchy-Riemann equations, named after the famous nineteenth-century mathematicians Augustin-Louis Cauchy and Bernhard Riemann, two of the founders of modern complex analysis.

Theorem 1: A complex function $f(z)=u(x,y)+iv(x,y)$ has a complex derivative $f'(z)$ if and only if its real and imaginary parts are continuously differentiable and satisfy the Cauchy-Riemann equations \begin{eqnarray*} u_x=v_y, \quad u_y=-v_x. \end{eqnarray*} In this case, the complex derivative of $f(z)$ is equal to any of the following expressions: $$f'(z)=u_x+iv_x = v_y - i u_y.$$

Example 1: Consider the function $f(z)=z^2,$ which can be written as $$z^2 = \left(x^2-y^2\right)+ i \left(2xy\right).$$ Its real part $u = x^2-y^2$ and imaginary part $v=2xy$ satisfy the Cauchy-Riemann equations, since $$u_x=2x = v_y, \quad u_y = -2y = -v_x.$$ Theorem 1 implies that $f(z)=z^2$ is differentiable. Its derivative turns out to be

$$f'(z)=u_x+iv_x = v_y - i u_y = 2x + i 2y = 2(x+iy) = 2z.$$
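The computation in Example 1 can be checked numerically. This sketch (the finite-difference helpers are illustrative, not part of the text) approximates the partial derivatives of $u = x^2 - y^2$ and $v = 2xy$ and confirms both the Cauchy-Riemann equations and the formula $f'(z) = u_x + iv_x = 2z$:

```python
# Finite-difference check of the Cauchy-Riemann equations for f(z) = z^2.
def u(x, y): return x*x - y*y          # Re(z^2)
def v(x, y): return 2*x*y              # Im(z^2)

def d_dx(g, x, y, h=1e-5): return (g(x + h, y) - g(x - h, y)) / (2*h)
def d_dy(g, x, y, h=1e-5): return (g(x, y + h) - g(x, y - h)) / (2*h)

x0, y0 = 1.3, -0.7                     # arbitrary sample point
ux, uy = d_dx(u, x0, y0), d_dy(u, x0, y0)
vx, vy = d_dx(v, x0, y0), d_dy(v, x0, y0)
fprime = complex(ux, vx)               # f'(z) = u_x + i v_x
print(ux - vy, uy + vx)                # Cauchy-Riemann: both ~ 0
print(fprime, 2 * complex(x0, y0))     # both ~ 2z
```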

Fortunately, the complex derivative obeys all of the usual rules that we learned in real-variable calculus. For example,

$$\frac{d}{dz}z^n = nz^{n-1}, \quad \frac{d}{dz} e^{cz} = ce^{cz}, \quad \frac{d}{dz} \log z = \frac{1}{z},$$

and so on. In these formulae, the power $n$ can be a real number (or even complex, in view of the identity $z^n = e^{n \log z}$), while $c$ is any complex constant. The exponential formulae for the complex trigonometric and hyperbolic functions imply that they also satisfy the standard rules

\begin{eqnarray*} \frac{d}{dz}\sin z &=& \cos z, \quad \frac{d}{dz} \cos z = -\sin z.\\ \frac{d}{dz}\sinh z &=& \cosh z, \quad \frac{d}{dz} \cosh z = \sinh z. \end{eqnarray*}

If the derivatives of $f$ and $g$ exist at a point $z,$ then \begin{eqnarray*} \frac{d}{dz}\left[f(z)+g(z) \right]=f^{\prime}(z)+g^{\prime}(z)\\ \frac{d}{dz}\left[f(z)g(z) \right]=f(z)g^{\prime}(z)+f^{\prime}(z)g(z) \end{eqnarray*} and, when $g(z)\neq 0,$ \begin{eqnarray*} \frac{d}{dz}\left[\frac{f(z)}{g(z)} \right]=\frac{g(z)f^{\prime}(z)-f(z)g^{\prime}(z)}{\left[g(z)\right]^2}. \end{eqnarray*} Finally, suppose that $f$ has a derivative at $z_0$ and that $g$ has a derivative at the point $f(z_0).$ Then the function $F(z) = g\left(f(z)\right)$ has a derivative at $z_0,$ and \begin{eqnarray*} F^{\prime}(z_0)=g^{\prime}\left(f(z_0)\right)f^{\prime}(z_0). \end{eqnarray*} Note that the formulae for differentiating sums, products, ratios, inverses, and compositions of complex functions are all identical to their real counterparts, with similar proofs. This means that you don't need to learn any new rules for performing complex differentiation!
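These rules can be sanity-checked numerically. In the sketch below (illustrative helpers, not from the text), the complex derivative of an analytic function is approximated by a central difference along the real axis, and the product and chain rules are verified at a sample point:

```python
import cmath

# For analytic f, the complex derivative may be approximated by a
# central difference taken along the real direction.
def deriv(f, z, h=1e-6):
    return (f(z + h) - f(z - h)) / (2 * h)

f = lambda z: z**3
g = cmath.exp
z0 = 0.5 + 0.2j                        # arbitrary sample point

# Product rule: (fg)' = f g' + f' g
product_lhs = deriv(lambda z: f(z) * g(z), z0)
product_rhs = f(z0) * deriv(g, z0) + deriv(f, z0) * g(z0)

# Chain rule: (g o f)' = g'(f(z)) f'(z)
chain_lhs = deriv(lambda z: g(f(z)), z0)
chain_rhs = deriv(g, f(z0)) * deriv(f, z0)

print(product_lhs, product_rhs)
print(chain_lhs, chain_rhs)
```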


Sufficient conditions for differentiability

Satisfaction of the Cauchy-Riemann equations at a point $z_0 = (x_0, y_0)$ is not sufficient to ensure the existence of the derivative of a function $f(z)$ at that point. However, by adding continuity conditions to the partial derivatives, we have the following useful theorem.

Theorem 2: Let the function $f(z) = u(x, y) + iv(x, y)$ be defined throughout some $\varepsilon$ neighbourhood of a point $z_0 = x_0 + iy_0,$ and suppose that (1) first-order partial derivatives of the functions $u$ and $v$ with respect to $x$ and $y$ exist everywhere in the neighbourhood; (2) those partial derivatives are continuous at $(x_0, y_0)$ and satisfy the Cauchy-Riemann equations $$u_x = v_y,\quad u_y =-v_x$$ at $(x_0, y_0).$ Then $f^{\prime}(z_0)$ exists, its value being $$f^{\prime}(z_0) = u_x + iv_x$$ where the right-hand side is to be evaluated at $(x_0, y_0).$

Example 2: Consider the exponential function $$f(z) = e^z = e^xe^{iy} \quad \quad (z = x + iy).$$ In view of Euler's formula, this function can be written as $$f(z) = e^x \cos y + ie^x \sin y,$$ where $y$ is to be taken in radians when $\cos y$ and $\sin y$ are evaluated. Then

$$u(x, y) = e^x \cos y\quad \text{and} \quad v(x, y) = e^x \sin y.$$

Since $u_x = v_y$ and $u_y = -v_x$ everywhere and since these derivatives are everywhere continuous, the conditions in the above theorem are satisfied at all points in the complex plane. Thus $f^{\prime}(z)$ exists everywhere, and

$$f^{\prime}(z) = u_x + iv_x = e^x \cos y + ie^x \sin y.$$

Note that $f^{\prime}(z) = f (z)$ for all $z.$
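As a quick sketch, the identity $f'(z) = u_x + iv_x = f(z)$ for the exponential can be confirmed at a sample point, using the closed-form partials $u_x = e^x\cos y$ and $v_x = e^x \sin y$:

```python
import math
import cmath

# For f(z) = e^z with u = e^x cos y and v = e^x sin y,
# assemble f'(z) = u_x + i v_x and compare it with f(z) itself.
x0, y0 = 0.4, 1.1                      # arbitrary sample point
ux = math.exp(x0) * math.cos(y0)       # u_x
vx = math.exp(x0) * math.sin(y0)       # v_x
fprime = complex(ux, vx)               # f'(z) = u_x + i v_x
fz = cmath.exp(complex(x0, y0))        # f(z)
print(fprime, fz)                      # equal up to rounding
```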

A consequence of the Cauchy-Riemann conditions is that the level curves of $u,$ that is, the curves $u(x,y)=c_1$ for a real constant $c_1,$ are orthogonal to the level curves of $v,$ where $v(x,y)=c_2,$ at all points where $f^{\prime}$ exists and is nonzero. From Theorem 2 we have

\begin{eqnarray*} \bigg|f^{\prime}(z)\bigg|^2 &= &\left(\frac{\partial u}{\partial x}\right)^2+\left(\frac{\partial v}{\partial x}\right)^2\\ &=& \left(\frac{\partial u}{\partial x}\right)^2+\left(\frac{\partial u}{\partial y}\right)^2\\ &=& \left(\frac{\partial v}{\partial x}\right)^2+\left(\frac{\partial v}{\partial y}\right)^2 \end{eqnarray*}
hence the two-dimensional gradients
\begin{align*} \nabla u &= \left(\frac{\partial u}{\partial x},\frac{\partial u}{\partial y}\right) \quad \text{and}\quad \nabla v = \left(\frac{\partial v}{\partial x},\frac{\partial v}{\partial y}\right) \end{align*}
are nonzero. We know from vector calculus that the gradient is orthogonal to its level curve (i.e., $du=\nabla u\cdot d\mathbf{s}=0,$ where $d\mathbf{s}$ points in the direction of the tangent to the level curve), and from the Cauchy-Riemann equations (Theorem 2) we see that the gradients $\nabla u$ and $\nabla v$ are orthogonal because their vector dot product vanishes:
\begin{eqnarray*} \nabla u\cdot \nabla v&=& \frac{\partial u}{\partial x}\frac{\partial v}{\partial x}+\frac{\partial u}{\partial y}\frac{\partial v}{\partial y}\\ &=&-\frac{\partial u}{\partial x}\frac{\partial u}{\partial y}+\frac{\partial u}{\partial y}\frac{\partial u}{\partial x}=0
\end{eqnarray*}

Consequently, the two-dimensional level curves $u(x,y)=c_1$ and $v(x,y)=c_2$ are orthogonal.
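For a concrete check, take $f(z)=z^3$: expanding $(x+iy)^3$ gives $u = x^3-3xy^2$ and $v = 3x^2y - y^3.$ The sketch below (sample point chosen arbitrarily) evaluates $\nabla u \cdot \nabla v$ in closed form:

```python
# Gradients of the components of f(z) = z^3:
# u = x^3 - 3xy^2, v = 3x^2 y - y^3.
def grad_u(x, y):
    return (3*x*x - 3*y*y, -6*x*y)     # (u_x, u_y)

def grad_v(x, y):
    return (6*x*y, 3*x*x - 3*y*y)      # (v_x, v_y)

x0, y0 = 1.7, -0.4                     # arbitrary sample point
gu, gv = grad_u(x0, y0), grad_v(x0, y0)
dot = gu[0]*gv[0] + gu[1]*gv[1]
print(dot)   # 0.0: the gradients, hence the level curves, are orthogonal
```

Note that, by the Cauchy-Riemann equations, $\nabla v = (-u_y, u_x)$ is just $\nabla u$ rotated by $90^\circ,$ which is another way to see the orthogonality.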

Example 3: For the function $f(z) = z^2,$ the level curves $u(x, y) = c_1$ and $v(x, y) = c_2$ of the component functions are the hyperbolas indicated in Figure 2. Note the orthogonality of the two families. Observe also that the curves $u(x, y) = 0$ and $v(x, y) = 0$ intersect at the origin but are not, however, orthogonal to each other.


Figure 2: Orthogonal level curves of the real and imaginary components of $f(z) = z^2.$

Analytic functions

Let $f:A\rightarrow \mathbb C$ where $A\subset \mathbb C$ is an open set. The function is said to be analytic on $A$ if $f$ is differentiable at each $z_0\in A.$ The word "holomorphic", which is sometimes used, is synonymous with the word "analytic". The phrase "analytic at $z_0$" means $f$ is analytic on a neighborhood of $z_0.$

An entire function is a function that is analytic at each point in the entire finite plane. Since the derivative of a polynomial exists everywhere, it follows that every polynomial is an entire function.

If a function $f$ fails to be analytic at a point $z_0$ but is analytic at some point in every neighbourhood of $z_0,$ then $z_0$ is called a singular point, or singularity, of $f.$

Example 4: The function $$f(z) = \frac{1}{z}$$ is analytic at each nonzero point in the finite plane. On the other hand, the function $$f(z) = |z|^2$$ is not analytic at any point since its derivative exists only at $z = 0$ and not throughout any neighbourhood.
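The failure of $f(z)=|z|^2$ to be differentiable away from the origin can be seen directly from its difference quotients. This sketch (numerical, for illustration only) compares the quotients at $z_0 = 1$ along the real and imaginary directions:

```python
# Difference quotients of f(z) = |z|^2 at z0 = 1 along two directions.
f = lambda z: abs(z)**2
h = 1e-6
z0 = 1 + 0j
along_real = (f(z0 + h)      - f(z0)) / h         # ~ 2
along_imag = (f(z0 + 1j * h) - f(z0)) / (1j * h)  # ~ 0
print(along_real, along_imag)   # disagree, so f'(1) does not exist
```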

The point $z = 0$ is evidently a singular point of the function $f(z) = 1/z.$ The function $f(z) = |z|^2,$ on the other hand, has no singular points since it is nowhere analytic.

If two functions are analytic in a domain $D,$ their sum and their product are both analytic in $D.$ Similarly, their quotient is analytic in $D$ provided the function in the denominator does not vanish at any point in $D.$ In particular, the quotient $$\frac{P(z)}{Q(z)}$$ of two polynomials is analytic in any domain throughout which $Q(z)\neq 0.$

Furthermore, it follows from the chain rule for the derivative of a composite function that a composition of two analytic functions is analytic.

Example 5: The function $$f(z) = \frac{4z+1}{z^3-z}$$ is analytic throughout the $z$ plane except for the singular points $z=0$ and $z=\pm 1.$ Its analyticity is clear from the familiar differentiation formulas, which need to be applied only if an expression for $f^{\prime}(z)$ is wanted. In this case, we have $$f^{\prime}(z)=\frac{-8z^3-3z^2+1}{z^2(z^2-1)^2}.$$
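The quoted derivative formula can be sanity-checked numerically (a sketch, with helper names of my own choosing) by comparing it against a central difference at a point away from the singularities:

```python
# Central-difference check of the derivative of f(z) = (4z+1)/(z^3 - z).
def deriv(f, z, h=1e-6):
    return (f(z + h) - f(z - h)) / (2 * h)

f = lambda z: (4*z + 1) / (z**3 - z)
fprime = lambda z: (-8*z**3 - 3*z**2 + 1) / (z**2 * (z**2 - 1)**2)

z0 = 0.5 + 1.5j                        # away from the singular points 0, 1, -1
numeric, formula = deriv(f, z0), fprime(z0)
print(numeric, formula)                # agree up to the discretization error
```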

When a function is given in terms of its component functions $$u(x, y)\quad \text{and}\quad v(x, y),$$ its analyticity can be demonstrated by direct application of the Cauchy-Riemann equations.

Example 6: The function $$f(z)=e^ye^{ix}=e^y\cos x +ie^y\sin x$$ is nowhere analytic. The component functions are $$u(x,y)=e^y\cos x\quad\text{and}\quad v(x,y)=e^y\sin x.$$ If $f(z)$ were analytic, then by the Cauchy-Riemann equations

$$u_x=v_y \Rightarrow -e^y\sin x=e^y\sin x \Rightarrow 2e^y\sin x=0 \Rightarrow \sin x=0$$
and
$$u_y=-v_x \Rightarrow e^y\cos x=-e^y\cos x \Rightarrow 2e^y\cos x=0 \Rightarrow \cos x=0.$$
On the one hand, we have that the roots of $\sin x$ are $n\pi$ (with $n\in \mathbb Z$) but $\cos (n\pi)=(-1)^n\neq0.$ On the other hand, the roots of $\cos x$ are $(2n-1)\pi/2$ but
$$\sin ((2n-1)\pi/2)=-\cos(n\pi)=-(-1)^n\neq 0.$$
Consequently, the Cauchy-Riemann equations are not satisfied anywhere.
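The argument above can be illustrated numerically. At a point where the first Cauchy-Riemann equation could hold ($\sin x = 0,$ e.g. $x = \pi$), the second equation fails; the sketch below (sample $y$ chosen arbitrarily) evaluates both sides:

```python
import math

# For u = e^y cos x and v = e^y sin x, the equation u_x = v_y forces
# sin x = 0. At such a point, test the second equation u_y = -v_x.
x, y = math.pi, 0.3                    # sin(pi) = 0 (up to rounding)
uy = math.exp(y) * math.cos(x)         # u_y = e^y cos x
minus_vx = -math.exp(y) * math.cos(x)  # -v_x = -e^y cos x
print(uy, minus_vx)   # equal magnitude, opposite sign: u_y != -v_x
```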
