You are given two vector subspaces U = span $\left(\begin{bmatrix} 1 \cr 1 \cr 1 \cr 1 \cr 1 \end{bmatrix}, \begin{bmatrix} 1 \cr 0 \cr 0 \cr 0 \cr -1 \end{bmatrix} \right)$ and V = span $\left(\begin{bmatrix} -1 \cr -1 \cr 0 \cr 0 \cr 1 \end{bmatrix}, \begin{bmatrix} 3 \cr 2 \cr 2 \cr 2 \cr 1 \end{bmatrix}, \begin{bmatrix} 2 \cr 1 \cr 2 \cr 2 \cr 2 \end{bmatrix} \right)$.
How would you find a basis of U+V and a basis of U∩V?
My difficulty is with the linearly dependent vector in the spanning set of V (the third vector is the sum of the first two). When am I allowed to remove this vector and perform Gaussian elimination on a 5×4 matrix instead of the full 5×5 matrix (for the sum, for the intersection, or for both)?
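As a sanity check on the hand computation, here is a short sketch using `sympy` (the choice of library is an assumption; any CAS or a rank computation by hand works the same way). It finds a basis of U+V as the pivot columns of the matrix whose columns are all the spanning vectors, confirms the dependence in V, and extracts U∩V from the nullspace of $[\,u_1\; u_2\; -v_1\; -v_2\,]$, since a nullspace vector $(a,b,c,d)$ encodes $a u_1 + b u_2 = c v_1 + d v_2 \in U \cap V$.

```python
import sympy as sp

# Spanning vectors from the question
u1 = sp.Matrix([1, 1, 1, 1, 1])
u2 = sp.Matrix([1, 0, 0, 0, -1])
v1 = sp.Matrix([-1, -1, 0, 0, 1])
v2 = sp.Matrix([3, 2, 2, 2, 1])
v3 = sp.Matrix([2, 1, 2, 2, 2])

# The third spanning vector of V is redundant: v3 = v1 + v2,
# so dropping it changes neither V nor U + V.
assert v3 == v1 + v2

# Basis of U + V: pivot columns of the matrix with all five vectors as columns.
M = sp.Matrix.hstack(u1, u2, v1, v2, v3)
sum_basis = M.columnspace()
print(len(sum_basis))        # dim(U + V) = 3

# Basis of U ∩ V via the nullspace of [u1 u2 -v1 -v2]:
# (a, b, c, d) in the nullspace means a*u1 + b*u2 = c*v1 + d*v2.
A = sp.Matrix.hstack(u1, u2, -v1, -v2)
inter_basis = [n[0] * u1 + n[1] * u2 for n in A.nullspace()]
print(len(inter_basis))      # dim(U ∩ V) = 1
```

This is consistent with the dimension formula: dim(U+V) = dim U + dim V − dim(U∩V) = 2 + 2 − 1 = 3.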