Definition: Let $(V,F,+,\cdot)$ be a vector space over $F$. $(V,F,+,\cdot,\times)$ is a linear algebra over $F$ if $\times:V\times V\to V$ has the following properties:
$(1)$ $\alpha \times (\beta \times \gamma)$ $=(\alpha \times \beta)\times \gamma$, for all $\alpha, \beta, \gamma\in V$
$(2)$ $\alpha \times (\beta+\gamma)$ $=\alpha \times \beta + \alpha \times \gamma$ and $(\alpha +\beta)\times \gamma$ $=\alpha \times \gamma +\beta \times \gamma$, for all $\alpha, \beta, \gamma\in V$
$(3)$ $c\cdot (\alpha \times \beta)$ $=(c\cdot \alpha)\times \beta$ $=\alpha \times (c\cdot \beta)$, for all $c\in F$ and $\alpha, \beta\in V$.
If there exists $1_V\in V$ such that $1_V \times \alpha$ $=\alpha \times 1_V$ $=\alpha$, for all $\alpha \in V$, then we say $(V,F,+,\cdot, \times)$ is a linear algebra with identity over $F$. If $\alpha \times \beta$ $=\beta \times \alpha$, for all $\alpha ,\beta \in V$, then we say $V$ is commutative.
Define a vector multiplication operation on $F^\infty$ by $(f\times g)_n$ $=\sum_{i=0}^n f_i\cdot g_{n-i}$, for all $n\geq 0$. For notational simplicity, I will denote $\times$ by $\cdot$; it will be clear from context which operation we're using.
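This is the familiar convolution (Cauchy) product of sequences. As a small illustration, not part of the proof, here is a sketch in Python; I model $F$ by Python numbers and truncate sequences to their first few terms, and `convolve` is my own helper name:

```python
def convolve(f, g):
    """First min(len(f), len(g)) terms of (f x g)_n = sum_{i=0}^n f_i * g_{n-i}."""
    m = min(len(f), len(g))
    return [sum(f[i] * g[n - i] for i in range(n + 1)) for n in range(m)]

# For f = (1, 2, 3, ...) and g = (4, 5, 6, ...):
#   (f x g)_0 = 1*4 = 4
#   (f x g)_1 = 1*5 + 2*4 = 13
#   (f x g)_2 = 1*6 + 2*5 + 3*4 = 28
print(convolve([1, 2, 3], [4, 5, 6]))  # → [4, 13, 28]
```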
Prove that $F^\infty$ $=F^\Bbb{N}$ $=\{(x_i)_{i\geq 0}\mid x_i\in F\}$ is a commutative linear algebra with identity over $F$.
Hoffman’s proof: Let $f,g\in F^\infty$. Then $(f\cdot g)_n$ $=\sum_{i=0}^n f_i\cdot g_{n-i}$ $= \sum_{i=0}^n g_i\cdot f_{n-i}$ $=(g\cdot f)_n$, for all $n\geq 0$. Que: How does one rigorously prove $\sum_{i=0}^n f_i\cdot g_{n-i}$ $= \sum_{i=0}^n g_i\cdot f_{n-i}$? I think we have to shift the index in a certain manner, but I can't recall exactly how.
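One standard way to make this rigorous is the substitution $j=n-i$ (which merely re-lists the same $n+1$ terms of a finite sum in reverse order), together with commutativity of multiplication in $F$:
$$\sum_{i=0}^n f_i\cdot g_{n-i} \;\overset{j=n-i}{=}\; \sum_{j=0}^n f_{n-j}\cdot g_{j} \;=\; \sum_{j=0}^n g_j\cdot f_{n-j},$$
and relabeling $j$ back to $i$ in the last sum gives $\sum_{i=0}^n g_i\cdot f_{n-i}$.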
We show $F^\infty$ is a linear algebra over $F$.

$(1)$ Let $f,g,h\in F^\infty$. Then $[(f\cdot g)\cdot h]_n$ $=\sum_{i=0}^n (f\cdot g)_i\cdot h_{n-i}$ $=\sum_{i=0}^n \sum_{j=0}^i f_j\cdot g_{i-j}\cdot h_{n-i}$ $=\sum_{j=0}^n f_j\sum_{i=0}^{n-j} g_i \cdot h_{n-j-i}$ $= \sum_{j=0}^n f_j \cdot (g\cdot h)_{n-j}$ $=[f\cdot (g\cdot h)]_n$, for all $n\geq 0$. Que: I don't understand the step $\sum_{i=0}^n \sum_{j=0}^i f_j\cdot g_{i-j}\cdot h_{n-i}$ $=\sum_{j=0}^n f_j\sum_{i=0}^{n-j} g_i \cdot h_{n-j-i}$.

$(2)$ Let $f,g,h\in F^\infty$. Then $[f\cdot (g+h)]_n$ $=\sum_{i=0}^nf_i\cdot (g+h)_{n-i}$ $= \sum_{i=0}^nf_i\cdot (g_{n-i}+h_{n-i})$ $= \sum_{i=0}^n(f_i\cdot g_{n-i}+f_i \cdot h_{n-i})$ $= \sum_{i=0}^nf_i\cdot g_{n-i} + \sum_{i=0}^nf_i\cdot h_{n-i}$ $=(f\cdot g)_n +(f\cdot h)_n$ $=(f\cdot g+f\cdot h)_n$, for all $n\geq 0$. Thus $f\cdot (g+h)=f\cdot g+f\cdot h$. Since vector multiplication on $F^\infty$ is commutative, we also have $(g+h)\cdot f$ $=f\cdot (g+h)$ $=f\cdot g+f\cdot h$ $=g\cdot f+h\cdot f$.

$(3)$ Let $c\in F$ and $f,g\in F^\infty$. Then $[c\cdot (f\cdot g)]_n$ $=c\cdot (f\cdot g)_n$ $=c\cdot \left(\sum_{i=0}^n f_i\cdot g_{n-i}\right)$ $= \sum_{i=0}^n c\cdot (f_i\cdot g_{n-i})$ $= \sum_{i=0}^n (c\cdot f_i)\cdot g_{n-i}$ $=\sum_{i=0}^n (c\cdot f)_i\cdot g_{n-i}$ $=[(c\cdot f)\cdot g]_n$, for all $n\geq 0$. Thus $c\cdot (f\cdot g)$ $=(c\cdot f)\cdot g$. Similarly $[c\cdot (f\cdot g)]_n$ $= \sum_{i=0}^n c\cdot (f_i\cdot g_{n-i})$ $= \sum_{i=0}^n f_i\cdot (c\cdot g_{n-i})$ $=\sum_{i=0}^n f_i\cdot (c\cdot g)_{n-i}$ $=[f \cdot (c\cdot g)]_n$, for all $n\geq 0$. Thus $c\cdot (f\cdot g)$ $=f\cdot (c\cdot g)$. Hence $c\cdot (f\cdot g)$ $=(c\cdot f)\cdot g$ $=f\cdot (c\cdot g)$.
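The questioned step in $(1)$ is an interchange of the order of summation. Both double sums range over the same triangle $\{(i,j): 0\le j\le i\le n\}$: summing first over $i$ and then over $j\le i$ lists exactly the same terms as summing first over $j$ and then over $i\ge j$. Hence
$$\sum_{i=0}^n\sum_{j=0}^i f_j\, g_{i-j}\, h_{n-i} \;=\; \sum_{j=0}^n\sum_{i=j}^n f_j\, g_{i-j}\, h_{n-i} \;\overset{k=i-j}{=}\; \sum_{j=0}^n f_j \sum_{k=0}^{n-j} g_k\, h_{n-j-k},$$
where the last equality substitutes $k=i-j$ in the inner sum; relabeling $k$ as $i$ gives $\sum_{j=0}^n f_j\sum_{i=0}^{n-j} g_i \cdot h_{n-j-i}$.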
We claim $1=(1,0,0,\dots)$ is the identity element of $F^\infty$. Let $g$ $=1$ $=(1,0,0,\dots)$ and $f\in F^\infty$. Then $(g\cdot f)_n$ $=g_0\cdot f_n + \sum_{i=1}^n g_i\cdot f_{n-i}$ $=1\cdot f_n+\sum_{i=1}^n 0\cdot f_{n-i}$ $=f_n$, for all $n\geq 0$. Thus $g\cdot f$ $=1\cdot f$ $=f$. Similarly, $(f\cdot g)_n$ $=\sum_{i=0}^{n-1}f_i \cdot g_{n-i} +f_n \cdot g_0$. Since $i\leq n-1$ implies $n-i\geq 1$, we have $g_{n-i}=0$ for all $i \leq n-1$. So $\sum_{i=0}^{n-1}f_i \cdot g_{n-i} +f_n \cdot g_0$ $=\sum_{i=0}^{n-1}f_i \cdot 0 +f_n \cdot 1$ $=f_n$, for all $n\geq 0$. Thus $f\cdot g$ $=f\cdot 1$ $=f$. Hence $1\cdot f$ $=f\cdot 1$ $=f$. Therefore $(F^\infty, F,+,\cdot, \times)$ is a commutative linear algebra with identity over $F$. Is my proof correct?
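Not a substitute for the proof, but as a quick numerical sanity check of the identity and commutativity claims, here is a sketch assuming $F=\Bbb{Q}$ modeled by Python's `Fraction`, with sequences truncated to their first $N$ terms (`convolve` is my own helper name):

```python
from fractions import Fraction

def convolve(f, g):
    """First min(len(f), len(g)) terms of (f x g)_n = sum_{i=0}^n f_i * g_{n-i}."""
    m = min(len(f), len(g))
    return [sum(f[i] * g[n - i] for i in range(n + 1)) for n in range(m)]

N = 5
one = [Fraction(1)] + [Fraction(0)] * (N - 1)   # 1 = (1, 0, 0, ...), truncated
f = [Fraction(k, k + 1) for k in range(N)]      # sample sequence 0, 1/2, 2/3, ...
g = [Fraction(2 * k + 1) for k in range(N)]     # sample sequence 1, 3, 5, ...

assert convolve(one, f) == f == convolve(f, one)   # 1*f = f*1 = f
assert convolve(f, g) == convolve(g, f)            # commutativity
print("identity and commutativity checks pass")
```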