
It's a general question: what is the right way to find a basis for a subspace?

I found similar questions, but in my opinion the answers don't address the question: the question asks how to find a basis for the subspace, while the answer just checks whether the given vectors are linearly independent: Find bases for subspaces spanned by vectors.


I will ask it through a question I tried to answer.

Let $U_1, U_2$ be subspaces of $\mathbb{R}_4[x]$ such that:

$$U_1 = Sp\{x^3+2x^2+3x+6, 4x^3-x^2+3x+6, 5x^3+x^2+6x+12\}$$ $$U_2 = Sp\{x^3-x^2+x+1,2x^3-x^2+4x+5\}$$

Find a basis and the dimension of $U_1+U_2$.


$U_1+U_2 = Sp\{x^3+2x^2+3x+6, 4x^3-x^2+3x+6, 5x^3+x^2+6x+12,x^3-x^2+x+1,2x^3-x^2+4x+5\}$

(I know I just took the union of the spanning sets rather than literally adding the subspaces, but that follows from a theorem about sums of spanned subspaces: the sum of the spans is the span of the union.)

Now I want to find the linearly independent vectors in the spanning set; they will form the basis.

We will take the coordinates of the spanning set relative to the standard basis $B = \{x^3,x^2,x,1\}$, put the coordinate vectors as rows of a matrix, and row reduce the matrix; each nonzero row will correspond to an independent vector. The set of these linearly independent vectors will be a basis for $U_1+U_2$.

$$\begin{bmatrix}1&2&3&6 \\ 4&-1&3&6\\5&1&6&12\\1&-1&1&1\\2&-1&4&5\end{bmatrix} \xrightarrow{\text{row reduce}}\begin{bmatrix}1&2&3&6 \\ 0&-9&-9&-18\\0&0&1&1\\0&0&0&0\\0&0&0&0\end{bmatrix}$$

Therefore, $B_{(U_1+U_2)} = \{x^3+2x^2+3x+6, -9x^2-9x-18, x+1\}$
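For anyone who wants to verify this kind of elimination mechanically, here is a quick sanity check with SymPy (my own helper code, not part of the question). Note that SymPy's `rref` normalizes further than the elimination above, so its nonzero rows give an equivalent but differently written basis:

```python
from sympy import Matrix

# Coordinate rows of the five spanning polynomials relative to
# the standard basis B = {x^3, x^2, x, 1}.
M = Matrix([
    [1,  2, 3,  6],   # x^3 + 2x^2 + 3x + 6
    [4, -1, 3,  6],   # 4x^3 - x^2 + 3x + 6
    [5,  1, 6, 12],   # 5x^3 + x^2 + 6x + 12
    [1, -1, 1,  1],   # x^3 - x^2 + x + 1
    [2, -1, 4,  5],   # 2x^3 - x^2 + 4x + 5
])

print(M.rank())        # 3, so dim(U_1 + U_2) = 3
rref, pivots = M.rref()
print(rref)            # nonzero rows give an (equivalent) basis
```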

Do I have any mistakes?

Maybe I could first find bases for $U_1$ and $U_2$ separately and then take $B_{U_1}+B_{U_2} = B_{U_1+U_2}$?

And can someone explain why taking the coordinates and finding the linearly independent ones among them is the same as working with the vectors themselves? (Maybe because the coordinates relative to the basis represent the vector uniquely?)

Alon

1 Answer


To answer your questions:

Do I have mistakes?

Your writing is a bit unclear and has some grammatical/spelling mistakes. That being said, in terms of the computations you have done and the conclusions you have reached, everything is correct (as far as I can tell). Well done.

Maybe I could first find bases for $U_1$ and $U_2$ separately and then take $B_{U_1}+ B_{U_2} = B_{U_1 + U_2}$?

I think that you are asking whether the set $B_{U_1} \cup B_{U_2}$ will be a basis for $U_1 + U_2$. This is correct if and only if $U_1 \cap U_2 = \{0\}$. So, the method that you have used is a better choice.
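In fact, in this very problem $U_1 \cap U_2 \neq \{0\}$, so uniting separate bases would fail here. A concrete check with SymPy (illustrative helper code, not part of the original answer):

```python
from sympy import Matrix

# Coordinate rows of the spanning sets of U_1 and U_2.
U1 = Matrix([[1, 2, 3, 6], [4, -1, 3, 6], [5, 1, 6, 12]])
U2 = Matrix([[1, -1, 1, 1], [2, -1, 4, 5]])

B1 = Matrix.vstack(*U1.rowspace())   # a basis of U_1 (only 2 rows: row 3 = row 1 + row 2)
B2 = Matrix.vstack(*U2.rowspace())   # a basis of U_2 (2 rows)

stacked = Matrix.vstack(B1, B2)      # the union B_{U_1} ∪ B_{U_2}: 4 vectors
print(stacked.rank())                # 3 < 4, so the union is linearly dependent
```

By the dimension formula, $\dim(U_1 \cap U_2) = \dim U_1 + \dim U_2 - \dim(U_1+U_2) = 2 + 2 - 3 = 1$, which is exactly why one of the four vectors in the union is redundant.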

And can someone explain why taking the coordinates and finding the linearly independent ones among them is the same as working with the vectors themselves? (Maybe because the coordinates relative to the basis represent the vector uniquely?)

You have the correct idea. Here is a formal statement and proof in response to the question that I think you are asking.

Claim: Suppose that $\mathcal B = \{v_1,\dots,v_n\}$ is a basis of $U$. Let $w_1,\dots,w_k$ be vectors for which for $j = 1,\dots,k,$ we have $$ w_{j} = a_{j1} v_1 + \cdots + a_{jn} v_n. $$ We call the vector $\alpha_j = (a_{j1}, \dots, a_{jn}) \in \Bbb R^n$ the coordinate vector of $w_j$ relative to $\mathcal B$. Then the vectors $w_1,\dots,w_k$ are linearly independent if and only if the corresponding coordinate vectors $\alpha_1, \dots ,\alpha_k$ are linearly independent.

Proof: Consider the following series of equivalent statements: $$ w_1,\dots,w_k\text{ are linearly dependent } \iff\\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{p=1}^k c_p w_p = 0 \iff\\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{p=1}^k c_p \sum_{q=1}^n a_{pq} v_q = 0 \iff\\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{q=1}^n \left(\sum_{p=1}^k c_p a_{pq}\right) v_q = 0 \iff\\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{p=1}^k c_p a_{pq} = 0 \text{ for } q = 1,\dots,n \iff \\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \left(\sum_{p=1}^k c_p a_{p1}, \dots ,\sum_{p=1}^k c_p a_{pn}\right) = 0 \iff \\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{p=1}^k c_p \left(a_{p1}, \dots , a_{pn}\right) = 0 \iff\\ \text{there are coefficients }c_1,\dots, c_k \text{ (not all zero) such that } \sum_{p=1}^k c_p \alpha_p = 0 \iff\\ \alpha_1,\dots,\alpha_k\text{ are linearly dependent. } $$
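As a concrete instance of the claim, using the polynomials from the question (illustrative check, not part of the formal proof): the first three polynomials are dependent (the third is the sum of the first two), and their coordinate vectors relative to $B = \{x^3,x^2,x,1\}$ are dependent in exactly the same way:

```python
from sympy import symbols, Matrix, expand

x = symbols('x')
p1 = x**3 + 2*x**2 + 3*x + 6
p2 = 4*x**3 - x**2 + 3*x + 6
p3 = 5*x**3 + x**2 + 6*x + 12

# Dependence among the polynomials themselves: p1 + p2 - p3 = 0 ...
assert expand(p1 + p2 - p3) == 0

# ... mirrors dependence among their coordinate vectors:
# the coordinate matrix has rank 2, not 3.
coords = Matrix([[1, 2, 3, 6], [4, -1, 3, 6], [5, 1, 6, 12]])
print(coords.rank())   # 2
```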

Ben Grossmann