
Let $M$ be an $n\times n$ upper triangular matrix such that all of its diagonal entries are $1$. $\Bigg($For example, $M=\begin{pmatrix} 1 & 3 & 2 \\ 0 & 1 & 5 \\ 0 & 0 & 1 \end{pmatrix}$ when $n=3$.$\Bigg)$

Let $\mu$ be Lebesgue measure on $\mathbb{R}^n$ and let $E \subset \mathbb{R}^n$ be a Lebesgue measurable set.

Then prove that $\mu(ME)=\mu(E)$, where $ME=\left\{ Mx \in \mathbb{R}^n \mid x\in E \right\}$, using the Fubini-Tonelli theorem.

I know that $\mu(ME)=\displaystyle\int_{\mathbb{R}^n} \chi_{ME}(x)\, dx$ and $\mu(E)=\displaystyle\int_{\mathbb{R}^n} \chi_{E}(x)\, dx$, but I can't come up with an idea for how to prove these are equal using Fubini-Tonelli.

This is a step in eventually proving that $\mu (ME) = |\det M| \mu(E)$ for any linear map $M$.


1 Answer


Let $U$ be the set of such matrices. Geometry is the name of the game here, so consider what happens when you apply $M \in U $ to $[0,1]^n$.

This amounts to observing the effect of $M$ on the vertices of the cube; applying $M$ to the far vertex $(1,1,\dots,1) = e_1 + \cdots + e_n$, for instance, produces the vector:

$$\begin{bmatrix} 1 + \sum_{j=2}^n a_{1,j} \\ 1 + \sum_{j=3}^n a_{2,j} \\ \vdots \\ 1 \end{bmatrix}$$
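
For instance, with the $3\times 3$ matrix from the question, the vertex $(1,1,1)$ is sent to

$$\begin{pmatrix} 1 & 3 & 2 \\ 0 & 1 & 5 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1+3+2 \\ 1+5 \\ 1 \end{pmatrix} = \begin{pmatrix} 6 \\ 6 \\ 1 \end{pmatrix}.$$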

More generally:

$$Mx = \begin{bmatrix} x_1 + \sum_{j=2}^n a_{1,j}x_j \\ x_2 + \sum_{j=3}^n a_{2,j}x_j \\ \vdots \\ x_n \end{bmatrix}$$

where $x = (x_1,\dots,x_n)$.

In other words, $M$ sends $x_n$ to $x_n$, sends $x_{n-1}$ to a linear combination of $x_{n-1}$ and $x_n$, and in general sends $x_i$ to a linear combination of the $x_j$ with $i \leqslant j \leqslant n$. One way to think of this is as a descending chain of subspaces.

In the $n=2$ case, this is just sending the square to a parallelogram.
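
Concretely, with $M = \begin{pmatrix} 1 & a \\ 0 & 1 \end{pmatrix}$ for some $a \in \mathbb{R}$, the unit square $[0,1]^2$ is mapped to the parallelogram with vertices $(0,0)$, $(1,0)$, $(1+a,1)$, $(a,1)$: each horizontal line $\{y = c\}$ is simply shifted to the right by $ac$.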

As is often the case in multivariable analysis, $n=2$ is notationally the easiest case; here it is also conceptually the simplest.

Proposition: $\mu(MS) = \mu(S)$ whenever $M \in U \subset \mathbb{R}^{2 \times 2}$ and $S \subseteq \mathbb{R}^2$ is measurable.

Proof: Write $M = \begin{pmatrix} 1 & a \\ 0 & 1 \end{pmatrix}$, so $M^{-1} = \begin{pmatrix} 1 & -a \\ 0 & 1 \end{pmatrix}$ and hence $\chi_{MS}(x,y) = \chi_S\!\big(M^{-1}(x,y)\big) = \chi_S(x - ay,\, y)$. The proposition is contained in the following series of equalities, where Tonelli applies because the integrands are nonnegative and measurable:

\begin{align*} \mu(MS) &= \iint_{\mathbb{R}^2} \chi_{MS}(x,y)\, d(x,y)\\ &= \int_{\mathbb{R}} \left( \int_{\mathbb{R}} \chi_{MS}(x,y)\, dx \right) dy \tag{Tonelli}\\ &= \int_{\mathbb{R}} \left( \int_{\mathbb{R}} \chi_{S}(x - ay,\, y)\, dx \right) dy \tag{$\chi_{MS} = \chi_S \circ M^{-1}$}\\ &= \int_{\mathbb{R}} \left( \int_{\mathbb{R}} \chi_{S}(x,y)\, dx \right) dy \tag{translation invariance in $x$}\\ &= \iint_{\mathbb{R}^2} \chi_S(x,y)\, d(x,y) = \mu(S) \tag{Tonelli again} \end{align*}

The key point is that the second coordinate is untouched: for each fixed $y$, the horizontal slice of $MS$ is just the horizontal slice of $S$ shifted by $ay$.

$\blacksquare$
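
To see the key step at work on the unit square: for each fixed $y \in [0,1]$, the horizontal slice of the parallelogram $M[0,1]^2$ is the interval $[ay,\, 1+ay]$, a translate of $[0,1]$, so every slice has length $1$ and the area is $\int_0^1 1 \, dy = 1$, exactly as the chain of equalities predicts.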

The rest is below. Take a moment to think about how you could extend this technique before proceeding.


However, what is a $3D$ parallelepiped but uncountably many $2D$ sets "glued" together? This suggests induction, since we just proved that the $2D$ measure is invariant under $M \in U$.
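
One way to make this "gluing" precise is the slicing identity that Tonelli's theorem provides for any measurable $A \subseteq \mathbb{R}^k$ (writing $\mu_{k-1}$ for $(k-1)$-dimensional Lebesgue measure):

$$\mu_k(A) = \int_{\mathbb{R}} \mu_{k-1}\big(A^{t}\big)\, dt, \qquad A^{t} := \{x'' \in \mathbb{R}^{k-1} : (x'', t) \in A\},$$

where the slice $A^t$ is measurable for almost every $t$. This is exactly the identity the induction below leans on.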

Proposition: Let $M \in U \subset \mathbb{R}^{n \times n}$ and let $S \subseteq \mathbb{R}^n$ be measurable. Then $\mu(MS) = \mu(S)$.

Proof:

We proceed by induction on $n$, the dimension of the space. The base case $n=1$ is just $\mu(S) = \mu(IS)$, which is clear, since the only $1 \times 1$ matrix in $U$ is the identity.

For the inductive step, suppose the claim holds in dimension $k-1$ and let $n = k$. Write

$$M = \begin{pmatrix} M'' & c \\ 0 & 1 \end{pmatrix}, \qquad M'' \in U \subset \mathbb{R}^{(k-1)\times(k-1)}, \quad c \in \mathbb{R}^{k-1},$$

so that $M(y'', y_k) = (M''y'' + y_k c,\, y_k)$ for $y'' \in \mathbb{R}^{k-1}$. Since the last coordinate is untouched, the slice of $MS$ at height $x_k$ is

$$(MS)^{x_k} = \{x'' \in \mathbb{R}^{k-1} : (x'', x_k) \in MS\} = M'' S^{x_k} + x_k c,$$

a translate of $M''$ applied to the corresponding slice $S^{x_k}$ of $S$ (which is measurable for almost every $x_k$, so the induction hypothesis applies to it). Then:

\begin{align*} \mu(MS) &= \int_{\mathbb{R}^k} \chi_{MS}(x'',x_k)\, d(x'',x_k)\\ &= \int_{\mathbb{R}} \left(\int_{\mathbb{R}^{k-1}} \chi_{MS}(x'',x_k)\, dx''\right) dx_k \tag{Tonelli}\\ &= \int_{\mathbb{R}} \mu_{k-1}\big(M'' S^{x_k} + x_k c\big)\, dx_k \tag{slice computed above}\\ &= \int_{\mathbb{R}} \mu_{k-1}\big(M'' S^{x_k}\big)\, dx_k \tag{translation invariance}\\ &= \int_{\mathbb{R}} \mu_{k-1}\big(S^{x_k}\big)\, dx_k \tag{induction hypothesis}\\ &= \int_{\mathbb{R}} \left(\int_{\mathbb{R}^{k-1}} \chi_{S}(x'',x_k)\, dx''\right) dx_k = \mu(S) \tag{Tonelli again} \end{align*}

$\blacksquare$

The fact that this works can be seen as a consequence of the fact that deleting the last row and column of an $n \times n$ upper triangular unit-diagonal matrix leaves another such matrix, while the last row of $M$ is $(0,\dots,0,1)$. The matrix "doesn't touch" the last variable, so the induction is nicely contained.
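
Not part of the proof, but if you want a quick numerical sanity check of the "shear preserves area" picture, here is a small Python sketch (the helper name `polygon_area` is just for illustration): it applies a unit-diagonal upper triangular $2 \times 2$ matrix to the unit square and compares the areas via the shoelace formula.

```python
import numpy as np

# Quick numerical check (not a proof): a unit-diagonal upper triangular
# shear should preserve area. We shear the unit square and compare the
# areas computed with the shoelace formula.

def polygon_area(vertices):
    """Shoelace formula for a simple polygon given as an (m, 2) array of vertices in order."""
    x, y = vertices[:, 0], vertices[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

M = np.array([[1.0, 3.0],
              [0.0, 1.0]])  # upper triangular with 1's on the diagonal

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
sheared = square @ M.T      # apply M to each vertex (rows are points)

print(polygon_area(square))   # 1.0
print(polygon_area(sheared))  # also 1.0: the shear preserves area
```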
