We currently have rings and groups in school, and I'm fine proving the group axioms, except for proofs that an operation is well defined. I tried to solve two exercises I found on the internet, but I don't have a clue what to do.
1) Prove that matrix multiplication is well defined for $\begin{pmatrix} a & -b \\ b & a \end{pmatrix} \in \mathbb{R}^{2 \times 2}$ with $a,b \in \mathbb{R}$ and $a^2 + b^2 = 1$.
2) Prove that $\phi:\mathbb{Z}_{mn}\rightarrow \mathbb{Z}_{m} \times \mathbb{Z}_{n},\ z \mapsto ([z]_m,[z]_n)$ is well defined.
I tried the first one like this: $$\begin{pmatrix} a & -b \\ b & a \end{pmatrix} \cdot \begin{pmatrix} a' & -b' \\ b' & a' \end{pmatrix} = \begin{pmatrix} aa' -bb' & a(-b')-ba' \\ ba' + ab' & b(-b') + aa' \end{pmatrix},$$ but I don't know what to do now. On the second one I don't even know how to start. Any help or hints are appreciated.
3 Answers
There are basically two cases where a given 'human-understandable' operation might not be well-defined:
- The first is a question of domain and codomain. It may be possible that a function is not defined because it fails to land in the given codomain, or because it is a composite of functions where the range of one fails to fall into the domain of the next. Your first example is of this sort. To show that it is well defined, show that the resulting matrix from your multiplication lands within the set of matrices you are considering: i.e. its '$a$' and its '$b$' have a sum of squares equalling $1$. (It may help to consider other invariants of matrices you know, like the determinant, but if that fails you can fall back on brute calculation; see the sketch after this list.)
- The second case is where equivalence relations are concerned. A function $f(a)$ may not be well defined because $a$ was a representative of a class of elements, and the result would have been different if a different representative had been chosen (it should be the same for the entire class of equivalent elements). The classic example is that the operation on fractions $\frac{a}{b}\ \oplus \frac{c}{d}= \frac{a+c}{b+d}$ isn't well defined, because $\frac{1}{2} \oplus \frac{1}{1} = \frac{2}{3}$ but $\frac{1}{2}=\frac{2}{4}$ and $\frac{2}{4} \oplus \frac{1}{1} = \frac{3}{5} \neq \frac{2}{3}$. Your second example is of this form. The elements of $\mathbb{Z}_{nm}$ are classes of integers all differing by multiples of $nm$. You must prove that even if a different representative $z'$ is chosen, differing from the original by a multiple of $nm$, the resulting $f(z')$ will still be the same.
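To make the brute-calculation route for the first case concrete, here is a rough sketch, writing $A = aa'-bb'$ and $B = ab'+a'b$ for the entries of the product you already computed (these abbreviations are only for this sketch):
$$A^2 + B^2 = (aa'-bb')^2 + (ab'+a'b)^2 = a^2a'^2 - 2aa'bb' + b^2b'^2 + a^2b'^2 + 2aa'bb' + a'^2b^2 = (a^2+b^2)(a'^2+b'^2) = 1\cdot 1 = 1,$$
so the product matrix is again of the required form and the operation stays inside the set.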
More info about the second case: So, in $\mathbb{Z}_{nm}$ and only in $\mathbb{Z}_{nm}$ we know that $z + kmn$ and $z$ are equal. This is exactly why we expect them to be equal under the function. But we are only capable of representing them by integers from $\mathbb{Z}$, and if they aren't equal in $\mathbb{Z}$ then the formula we've given might not give equal answers for both. This is why it might not be well defined.
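Concretely, the check for $\phi$ would run roughly as follows (a sketch, taking $z' = z + kmn$ with $k \in \mathbb{Z}$ as above): since $kmn$ is a multiple of $m$ and also a multiple of $n$,
$$[z']_m = [z + (kn)m]_m = [z]_m, \qquad [z']_n = [z + (km)n]_n = [z]_n,$$
so $\phi(z') = ([z']_m,[z']_n) = ([z]_m,[z]_n) = \phi(z)$, i.e. the value does not depend on which representative was picked.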
Consider the 'function' $f: \mathbb{Z}_3 \rightarrow \{0,1\}$ with $f(z)=0$ if $z$ is even and $f(z) = 1$ if $z$ is odd. When you understand why this isn't well-defined, you'll understand what you need to prove in the second case.
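(If you want to check your reasoning on that exercise afterwards: the failure can already be seen from a single pair of representatives,
$$[1]_3 = [4]_3 \ \text{ in } \mathbb{Z}_3, \qquad \text{yet} \qquad f(1) = 1 \neq 0 = f(4),$$
so the proposed rule assigns two different values to one and the same element of $\mathbb{Z}_3$.)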

- So $z' = mn \cdot z$. Then $[z']_{mn} = [z \cdot mn]_{mn} = [z]_{mn}$. Is that correct? – Vajk Dec 07 '17 at 20:21
- No: $z'=z+kmn$, where $k$ gives you whatever multiple of $mn$ the difference between them can be. – Chessanator Dec 07 '17 at 20:22
- $z' = z + kmn$, so $[z']_{mn} = [z + kmn]_{mn} = [z]_{mn}$? – Vajk Dec 07 '17 at 20:32
- I'll add more explanation to the post. – Chessanator Dec 07 '17 at 20:33
Hints:
For 1), it is of course well defined as a matrix multiplication. All you have to do is check that the product satisfies the defining condition of this set of matrices.
For 2), you have to prove that if $[z]_{mn}=[z']_{mn}$, then $[z]_{m}=[z']_{m}$ and $[z]_{n}=[z']_{n}$.

1) Multiply two matrices of this form, and verify that the product again satisfies the condition, i.e., that it is again of this form with its two defining entries having squares summing to $1$. Then 2) goes through in the same spirit.
Edit: For 1), use that such matrices correspond to complex numbers $z=a+bi$ with $|z|^2=a^2+b^2=1$, i.e. $|z|=1$:
Then the multiplication of two such matrices corresponds to the multiplication of the associated complex numbers $z$ and $w$, and of course $$ |zw|=|z|\cdot |w|=1\cdot 1=1. $$
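Spelled out, the correspondence being used here is (one way to phrase it, with $z=a+bi$ and $w=a'+b'i$):
$$\begin{pmatrix} a & -b \\ b & a \end{pmatrix}\begin{pmatrix} a' & -b' \\ b' & a' \end{pmatrix} = \begin{pmatrix} aa'-bb' & -(ab'+a'b) \\ ab'+a'b & aa'-bb' \end{pmatrix} \ \longleftrightarrow\ (a+bi)(a'+b'i) = (aa'-bb') + (ab'+a'b)i,$$
and $|zw|^2 = (aa'-bb')^2 + (ab'+a'b)^2 = 1$ is exactly the condition the product matrix has to satisfy.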

- I already tried to show that $a'^2 + b'^2 = 1$ but I'm somewhat stuck; do you have any more hints? – Vajk Dec 07 '17 at 20:07