4

Find the value of $B = A^4- A^3 + A^2 - A$ where $A$ is the matrix given below $$ A= \left [ \begin{matrix} \cos\alpha & \sin \alpha \\ -\sin\alpha & \cos\alpha \end{matrix} \right ] $$

It's actually quite simple when you look at it: find $A$, $A^2$, $A^3$, $A^4$ by doing some matrix multiplication, and then add and subtract as the question asks.
I did all of the tedious work mentioned above.

But the matrix $B$ I obtain is filled with garbage. I assume we must either use some trigonometric identities, or there's something beautiful that I missed in the question.
Edit: Could the sum of a geometric series help?

Lorenzo B.
  • 2,252
  • 2
    Should the upper right entry be $-\sin \alpha$? This would make $A$ a rotation matrix (https://en.wikipedia.org/wiki/Rotation_matrix) which seems like it would have some nice properties. – Michael Lugo Apr 16 '18 at 19:53
  • @MichaelLugo Typo you got me – SmarthBansal Apr 16 '18 at 19:54
  • You should know that a rotation matrix like what you have satisfies the nice property that $A^n =\left[\begin{smallmatrix} \cos(n\alpha)&\sin(n\alpha)\\ -\sin(n\alpha)&\cos(n\alpha)\end{smallmatrix}\right]$. Beyond that, I'm not sure that very much "nice" simplification can still be made. – JMoravitz Apr 16 '18 at 20:00

5 Answers

2

Some hints: you can simplify by noting that $$(A+I)B=A^5-A=A(A^4-I)$$

$R(\theta)$ is the standard rotation matrix $\left(\begin{matrix}\cos\theta &-\sin\theta\\ \sin\theta&\cos\theta\end{matrix}\right)$ which rotates by angle $\theta$ anticlockwise about the origin. Note that $R(\theta)R(\phi)=R(\theta+\phi)$.

Therefore $A=R(-\alpha)$ represents clockwise rotation by angle $\alpha$, and $$A+I=\left(\begin{matrix}\cos\alpha+1 &\sin\alpha\\ -\sin\alpha&\cos\alpha+1\end{matrix}\right)=2\cos\frac{\alpha}{2}\,R\left(-\frac{\alpha}{2}\right)$$ $$\implies(A+I)^{-1}=\frac{1}{2\cos\frac{\alpha}{2}}R\left(\frac{\alpha}{2}\right)$$ (valid when $\cos\frac{\alpha}{2}\neq0$). Then $$B=(A+I)^{-1}A(A^4-I)$$
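The hint is easy to sanity-check numerically; a quick sketch (assuming Python with numpy, not part of the original answer):

```python
import numpy as np

# From (A + I) B = A^5 - A it follows that B = (A + I)^{-1} A (A^4 - I),
# valid whenever A + I is invertible (i.e. alpha is not an odd multiple of pi).
alpha = 0.7  # arbitrary test angle
c, s = np.cos(alpha), np.sin(alpha)
A = np.array([[c, s], [-s, c]])
I = np.eye(2)

B_direct = np.linalg.matrix_power(A, 4) - np.linalg.matrix_power(A, 3) + A @ A - A
B_hint = np.linalg.inv(A + I) @ A @ (np.linalg.matrix_power(A, 4) - I)
print(np.allclose(B_direct, B_hint))  # → True
```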

David Quinn
  • 34,121
1

For a rotation matrix $$ A= \left [ \begin{matrix} \cos\alpha & \sin \alpha \\ -\sin\alpha & \cos\alpha \end{matrix} \right ] $$ $A^2$ is just a rotation by $2\alpha$, $A^3$ is the rotation by $3\alpha$ and so on. $$ B= \left [ \begin{matrix} \cos 4\alpha-\cos3\alpha+\cos2\alpha-\cos\alpha & \sin 4\alpha-\sin3\alpha+\sin2\alpha-\sin\alpha \\ -(\sin 4\alpha-\sin3\alpha+\sin2\alpha-\sin\alpha) & \cos 4\alpha-\cos3\alpha+\cos2\alpha-\cos\alpha \end{matrix} \right ] $$
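The claim that $A^k$ is the matrix of angle $k\alpha$ can be confirmed numerically; a sketch, assuming numpy:

```python
import numpy as np

alpha = 0.7  # arbitrary test angle

def R(t):
    # A matrix of the same form as A in the question, with angle t
    return np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

A = R(alpha)
# A^n is the rotation-form matrix of angle n*alpha
for n in range(1, 5):
    assert np.allclose(np.linalg.matrix_power(A, n), R(n * alpha))

# Hence B has the entrywise form stated above
B = np.linalg.matrix_power(A, 4) - np.linalg.matrix_power(A, 3) + A @ A - A
expected = R(4 * alpha) - R(3 * alpha) + R(2 * alpha) - R(alpha)
print(np.allclose(B, expected))  # → True
```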

Andrei
  • 37,370
1

By induction, $$ A^n = A^{n-1}A = \begin{pmatrix} \cos{(n-1)\theta} & \sin{(n-1)\theta} \\ -\sin{(n-1)\theta} & \cos{(n-1)\theta} \end{pmatrix} \begin{pmatrix} \cos{\theta} & \sin{\theta} \\ -\sin{\theta} & \cos{\theta} \end{pmatrix} = \begin{pmatrix} \cos{(n-1)\theta}\cos{\theta} - \sin{\theta}\sin{(n-1)\theta} & \sin{(n-1)\theta}\cos{\theta} + \cos{(n-1)\theta}\sin{\theta} \\ -\sin{(n-1)\theta}\cos{\theta} - \cos{(n-1)\theta}\sin{\theta} & \cos{(n-1)\theta}\cos{\theta} - \sin{\theta}\sin{(n-1)\theta} \end{pmatrix} \\ = \begin{pmatrix} \cos{n\theta} & \sin{n\theta} \\ -\sin{n\theta} & \cos{n\theta} \end{pmatrix}. $$

The prosthaphaeresis formulae are $$ \sin{A}+\sin{B} = 2\sin{\tfrac{1}{2}(A+B)}\cos{\tfrac{1}{2}(A-B)} \\ \sin{A}-\sin{B} = 2\cos{\tfrac{1}{2}(A+B)}\sin{\tfrac{1}{2}(A-B)}\\ \cos{A}+\cos{B} = 2\cos{\tfrac{1}{2}(A+B)}\cos{\tfrac{1}{2}(A-B)}\\ \cos{A}-\cos{B} = -2\sin{\tfrac{1}{2}(A+B)}\sin{\tfrac{1}{2}(A-B)}. $$

Thus $$ \cos{4\theta}-\cos{3\theta} = -2\sin{\tfrac{7}{2}\theta}\sin{\tfrac{1}{2}\theta}, \\ \cos{2\theta}-\cos{\theta} = -2\sin{\tfrac{3}{2}\theta}\sin{\tfrac{1}{2}\theta}, $$ and adding gives $$ \cos{4\theta}-\cos{3\theta} + \cos{2\theta}-\cos{\theta} = -2\sin{\tfrac{1}{2}\theta} ( \sin{\tfrac{7}{2}\theta} + \sin{\tfrac{3}{2}\theta} ) \\ = -4\sin{\tfrac{1}{2}\theta}\sin{\tfrac{5}{2}\theta}\cos{\theta}. $$

Similarly, $$ \sin{4\theta}-\sin{3\theta} + \sin{2\theta}-\sin{\theta} = 4\sin{\tfrac{1}{2}\theta}\cos{\tfrac{5}{2}\theta}\cos{\theta}, $$ so $$ B = 4\sin{\tfrac{1}{2}\theta}\cos{\theta} \begin{pmatrix} -\sin{\tfrac{5}{2}\theta} & \cos{\tfrac{5}{2}\theta} \\ -\cos{\tfrac{5}{2}\theta} & -\sin{\tfrac{5}{2}\theta} \end{pmatrix}. $$
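The closed form $B = 4\sin\tfrac{\theta}{2}\cos\theta\left(\begin{smallmatrix} -\sin\tfrac{5\theta}{2} & \cos\tfrac{5\theta}{2} \\ -\cos\tfrac{5\theta}{2} & -\sin\tfrac{5\theta}{2} \end{smallmatrix}\right)$, with the off-diagonal signs matching the entries of $B$ derived above, can be checked numerically (a sketch, assuming numpy):

```python
import numpy as np

t = 0.7  # theta, arbitrary test angle
A = np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

# B = A^4 - A^3 + A^2 - A, computed directly
B = sum((-1) ** n * np.linalg.matrix_power(A, n) for n in (1, 2, 3, 4))

# Closed form: common factor 4 sin(t/2) cos(t) times a rotation-like matrix
f = 4 * np.sin(t / 2) * np.cos(t)
closed = f * np.array([[-np.sin(2.5 * t),  np.cos(2.5 * t)],
                       [-np.cos(2.5 * t), -np.sin(2.5 * t)]])
print(np.allclose(B, closed))  # → True
```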

Chappers
  • 67,606
1

Consider $z=e^{i\alpha}$; then $$ z^4-z^3+z^2-z=z(z^2+1)(z-1) $$ For simplicity, set $\alpha=2\beta$, so we have $$ e^{2i\beta}(e^{4i\beta}+1)(e^{2i\beta}-1)= e^{2i\beta}e^{2i\beta}(e^{2i\beta}+e^{-2i\beta})e^{i\beta}(e^{i\beta}-e^{-i\beta})= 4ie^{5i\beta}\cos2\beta\sin\beta $$ Thus your matrix $B$ is $$ B=4\cos2\beta\sin\beta \begin{bmatrix} -\sin5\beta & \cos5\beta \\ -\cos5\beta & -\sin5\beta \end{bmatrix} $$ Why does this work? The set of real matrices of the form $$ \begin{bmatrix} a & b \\ -b & a \end{bmatrix} $$ under matrix addition and multiplication is isomorphic to the complex field (the matrix above corresponds to the complex number $a+bi$) and, in this isomorphism, $e^{i\alpha}$ corresponds to the given rotation matrix.
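The isomorphism itself is easy to verify numerically without simplifying anything: evaluate the polynomial at $z=e^{i\alpha}$ and compare with the matrix computation (a sketch, assuming numpy):

```python
import numpy as np

# The matrix [[a, b], [-b, a]] corresponds to the complex number a + bi,
# and A corresponds to exp(i*alpha).
alpha = 0.7  # arbitrary test angle
z = np.exp(1j * alpha)
w = z**4 - z**3 + z**2 - z  # the polynomial applied to z

A = np.array([[np.cos(alpha), np.sin(alpha)], [-np.sin(alpha), np.cos(alpha)]])
B = np.linalg.matrix_power(A, 4) - np.linalg.matrix_power(A, 3) + A @ A - A

# B should be the matrix corresponding to the complex number w
W = np.array([[w.real, w.imag], [-w.imag, w.real]])
print(np.allclose(B, W))  # → True
```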

egreg
  • 238,574
1

If you want the quick method, scroll all the way down.

This method requires relatively few calculations, but you must know about eigenvalues, eigenvectors, and the diagonalization theorem for matrices.

First, prove that $A$ is diagonalizable:

Let $$A = [\; \mathbf{a_1} \;\; \mathbf{a_2} \;]$$

Then

$$\mathbf{a_1} · \mathbf{a_2} = \text{cos}(\alpha)\text{sin}(\alpha) - \text{sin}(\alpha) \text{cos}(\alpha) = 0$$

And

$$\Vert \mathbf{a_1} \Vert = \Vert \mathbf{a_2} \Vert = \sqrt{\text{cos}^2(\alpha) + \text{sin}^2(\alpha)} = 1$$

This is a trigonometric identity.

Both of these conditions make $A$ a normal matrix, and therefore diagonalizable: there exists an invertible $2 \times 2$ matrix $Q$ such that

$$Q^{-1}AQ = D$$

Where

$D$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal. (Since the eigenvalues here turn out to be complex, $Q$ is complex as well, so we work with $Q^{-1}$ rather than $Q^T$; $Q^{-1}$ equals the conjugate transpose of $Q$ only after normalizing its columns.)

The columns of $Q$ are eigenvectors of $A$, and their order corresponds to the order of the eigenvalues on the diagonal of $D$. So using the characteristic equation of $A$, we find the eigenvalues:

$$\text{det}(A-\lambda I) = (\text{cos}(\alpha) - \lambda)^2 - (-\text{sin}^2(\alpha)) = \lambda^2 - 2\lambda \text{cos}(\alpha)+\text{cos}^2(\alpha) + \text{sin}^2(\alpha) = \lambda^2 - 2\lambda\,\text{cos}(\alpha) + 1 = 0$$

$$\Rightarrow \lambda = \frac {2\text{cos}(\alpha) \pm \sqrt{4\text{cos}^2(\alpha) - 4\text{cos}^2(\alpha) - 4\text{sin}^2(\alpha)}}{2} = \text{cos}(\alpha) \pm i\;\text{sin}(\alpha)$$

Now, for $\lambda = \text{cos}(\alpha) \pm i\;\text{sin}(\alpha)$, we solve $(A-\lambda I)\mathbf{u} = \mathbf{0}$, where

$$A-\lambda I = \begin{bmatrix} \text{cos}(\alpha) - (\text{cos}(\alpha) \pm i\;\text{sin}(\alpha)) & \text{sin}(\alpha) \\ -\text{sin}(\alpha) & \text{cos}(\alpha) - (\text{cos}(\alpha) \pm i\;\text{sin}(\alpha)) \\ \end{bmatrix} = \begin{bmatrix} \mp i\;\text{sin}(\alpha) & \text{sin}(\alpha) \\ -\text{sin}(\alpha) & \mp i\;\text{sin}(\alpha) \\ \end{bmatrix}$$

The first case ($\lambda = \text{cos}(\alpha) + i\;\text{sin}(\alpha)$) yields the system $$\begin{bmatrix} -i\;\text{sin}(\alpha) & \text{sin}(\alpha) \\ -\text{sin}(\alpha) & -i\;\text{sin}(\alpha) \\ \end{bmatrix}\mathbf{u} = \mathbf{0}$$

Through Gaussian elimination (specifically $R_2 + iR_1$)

$$\begin{bmatrix} -i\;\text{sin}(\alpha) & \text{sin}(\alpha) \\ 0 & 0 \\ \end{bmatrix}\mathbf{u} = \mathbf{0}$$

So the first eigenvector of $A$ has the form $\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \end{bmatrix}$

With the property that $i\;\text{sin}(\alpha)u_1 = \text{sin}(\alpha) u_2 \Rightarrow iu_1 = u_2$ (for $\text{sin}(\alpha) \neq 0$) $\Rightarrow \mathbf{u} = \begin{bmatrix} 1 \\ i \\ \end{bmatrix}$

The second case ($\lambda = \text{cos}(\alpha) - i\;\text{sin}(\alpha)$) yields $$\begin{bmatrix} i\;\text{sin}(\alpha) & \text{sin}(\alpha) \\ -\text{sin}(\alpha) & i\;\text{sin}(\alpha) \\ \end{bmatrix}\mathbf{u} = \mathbf{0}$$

Through Gaussian elimination (specifically $R_2 - iR_1$)

$$\Rightarrow \begin{bmatrix} i\;\text{sin}(\alpha) & \text{sin}(\alpha) \\ 0 & 0 \\ \end{bmatrix}\mathbf{u} = \mathbf{0} $$

So the second eigenvector of $A$ has the form $\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \end{bmatrix}$

With the property that $-i\;\text{sin}(\alpha)u_1 = \text{sin}(\alpha) u_2 \Rightarrow -iu_1 = u_2 \Rightarrow \mathbf{u} = \begin{bmatrix} 1 \\ -i \\ \end{bmatrix}$

By the diagonalization theorem, we now know that:

$$ A = QDQ^{-1} = \begin{bmatrix} 1 & 1 \\ i & -i \\ \end{bmatrix} \begin{bmatrix} \text{cos}(\alpha) + i\;\text{sin}(\alpha) & 0 \\ 0 & \text{cos}(\alpha) - i\;\text{sin}(\alpha) \\ \end{bmatrix} \cdot \frac{1}{2}\begin{bmatrix} 1 & -i \\ 1 & i \\ \end{bmatrix}$$

Another theorem states that $A^n = QD^nQ^{-1}$. A proper proof would require mathematical induction, but here's a quick demonstration:

$$A^2 = (QDQ^{-1})(QDQ^{-1}) = QD(Q^{-1}Q)DQ^{-1}$$

Because $Q^{-1}Q = I$,

$$A^2 = \cdots = QDIDQ^{-1} = QDDQ^{-1} = QD^2Q^{-1}$$

It is not hard to see that this works $\forall n \in \mathbb{N}$, not only for $n = 1$ and $n = 2$.

It also holds that for a diagonal matrix $D = \text{diagonal}(d_1, \cdots, d_n)$, we have $D^k = \text{diagonal}(d_1^k, \cdots, d_n^k)$.

Therefore:

$$A^n = Q \begin{bmatrix} (\text{cos}(\alpha) + i\;\text{sin}(\alpha))^n & 0 \\ 0 & (\text{cos}(\alpha) - i\;\text{sin}(\alpha))^n \\ \end{bmatrix} Q^{-1}$$

So:

$$A^2 = Q \begin{bmatrix} (\text{cos}(\alpha) + i\;\text{sin}(\alpha))^2 & 0 \\ 0 & (\text{cos}(\alpha) - i\;\text{sin}(\alpha))^2 \\ \end{bmatrix} Q^{-1} = Q \begin{bmatrix} \text{cos}^2(\alpha) + 2i\;\text{cos}(\alpha)\text{sin}(\alpha) - \text{sin}^2(\alpha) & 0 \\ 0 & \text{cos}^2(\alpha) - 2i\;\text{cos}(\alpha)\text{sin}(\alpha) - \text{sin}^2(\alpha) \\ \end{bmatrix} Q^{-1}$$

$$A^3 = Q \begin{bmatrix} (\text{cos}(\alpha) + i\;\text{sin}(\alpha))^3 & 0 \\ 0 & (\text{cos}(\alpha) - i\;\text{sin}(\alpha))^3 \\ \end{bmatrix} Q^{-1}$$

$$A^4 = Q \begin{bmatrix} (\text{cos}(\alpha) + i\;\text{sin}(\alpha))^4 & 0 \\ 0 & (\text{cos}(\alpha) - i\;\text{sin}(\alpha))^4 \\ \end{bmatrix} Q^{-1}$$

Simplify that and then find $B$.
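The identity $A^n = QD^nQ^{-1}$ can be confirmed numerically; a sketch assuming numpy, which computes the eigendecomposition itself rather than hard-coding the eigenvectors:

```python
import numpy as np

alpha = 0.7  # arbitrary test angle
A = np.array([[np.cos(alpha), np.sin(alpha)], [-np.sin(alpha), np.cos(alpha)]])

# Eigendecomposition A = Q D Q^{-1}; the eigenvalues come out as
# cos(alpha) +/- i sin(alpha), as derived above.
evals, Q = np.linalg.eig(A)
D = np.diag(evals)
Qinv = np.linalg.inv(Q)

assert np.allclose(sorted(evals, key=lambda z: z.imag),
                   [np.cos(alpha) - 1j * np.sin(alpha),
                    np.cos(alpha) + 1j * np.sin(alpha)])

# A^n = Q D^n Q^{-1} for the powers needed to build B
for n in range(1, 5):
    An = Q @ np.linalg.matrix_power(D, n) @ Qinv
    assert np.allclose(An, np.linalg.matrix_power(A, n))
print("ok")  # all powers match
```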

Or alternatively, note that

$$B = A^4 - A^3 + A^2 - A = A^4 + A^2 - A^3 - A = A^4 + A^2 - (A^3 + A) = A^2(A^2 + I) - A(A^2 + I) = (A^2 - A)(A^2 + I)$$

And this could have been solved much more quickly, with literally two matrix multiplications (one for $A^2$ and another for $(A^2 - A)(A^2 + I)$).
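The factored form $(A^2 - A)(A^2 + I) = A^4 - A^3 + A^2 - A$ is also easy to verify numerically (a sketch, assuming numpy):

```python
import numpy as np

alpha = 0.7  # arbitrary test angle
A = np.array([[np.cos(alpha), np.sin(alpha)], [-np.sin(alpha), np.cos(alpha)]])
I = np.eye(2)

A2 = A @ A                    # one multiplication
B_fact = (A2 - A) @ (A2 + I)  # one more multiplication
B_direct = A2 @ A2 - A2 @ A + A2 - A
print(np.allclose(B_fact, B_direct))  # → True
```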

I realized you could do it this way halfway through my explanation of the other method, and since that write-up is already quite long, I'll leave it in.