This is the problem I'm dealing with:
Let $A$ be an $n \times n$ real matrix, let $\sigma_1,\dots,\sigma_n \in \mathbb{R}$, and let $b_1,\dots,b_n$ be column vectors of length $n$. Consider the systems $$ (A - \sigma_jI)x_j = b_j, \quad (j=1,\dots,n).$$ Show that all $n$ systems can be solved in $2n^3$ flops, neglecting terms of order $n^2$ and lower.
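For concreteness, here is a minimal sketch of the straightforward baseline that the bound is meant to beat: factor each shifted matrix from scratch. (The function name `solve_shifted_naive` and the use of SciPy's `lu_factor`/`lu_solve` are my own choices, not part of the exercise.)

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def solve_shifted_naive(A, sigmas, B):
    """Solve (A - sigma_j I) x_j = b_j by a fresh LU factorization for each j.

    A : (n, n) array, sigmas : length-n array of shifts, B : (n, n) array
    whose j-th column is b_j.  Returns X with x_j as its j-th column.
    """
    n = A.shape[0]
    X = np.empty_like(B, dtype=float)
    for j in range(n):
        lu, piv = lu_factor(A - sigmas[j] * np.eye(n))  # ~(2/3) n^3 flops per factorization
        X[:, j] = lu_solve((lu, piv), B[:, j])          # ~2 n^2 flops per solve
    return X
```

Since each factorization already costs $\tfrac{2}{3}n^3$ flops, this baseline costs about $\tfrac{2}{3}n^4$ in total, which is what the exercise asks us to improve on.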
My progress: An LU decomposition costs $2n^3/3$ flops in general, so factoring each $A - \sigma_j I$ from scratch would cost about $n \cdot \tfrac{2}{3}n^3 = \tfrac{2}{3}n^4$ flops, far more than the $2n^3$ budget; we therefore cannot simply solve the $n$ systems independently. My first idea is to link the LU decomposition of $A$ to that of $A - \sigma_j I$: if $A = LU$ and we write $L_jU_j$ for the LU decomposition of $A - \sigma_j I$ (ignoring pivoting), then $$ LU - \sigma_j I = L_jU_j.$$ This is where I'm stuck. The relationship only involves the products $LU$ and $L_jU_j$, which makes it hard to extract an expression for either factor on its own. Given $L$ and $U$, how can I find $L_j$ and $U_j$ without performing another full decomposition?
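To illustrate the sticking point numerically, here is a small check (my own illustration, using SciPy's `lu`, which also applies pivoting): the factors of $A - \sigma I$ do not appear to come from $L$ and $U$ by any simple modification, even though the two matrices differ only on the diagonal.

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
n, sigma = 4, 2.5
A = rng.standard_normal((n, n))

P, L, U = lu(A)                          # A = P @ L @ U
Ps, Ls, Us = lu(A - sigma * np.eye(n))   # A - sigma*I = Ps @ Ls @ Us

# For a generic A, the shifted factors differ from L and U throughout,
# not just on the diagonal, so there is no obvious entrywise update.
print(np.max(np.abs(L - Ls)))
print(np.max(np.abs(U - Us)))
```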