If so, can I define an inner product in our usual vector space where two vectors orthogonal with respect to the dot product are no longer orthogonal in the new inner product I define?
-
Yes and yes. For any basis $\{e_1, \ldots, e_n\}$, denote by $v^1, \ldots, v^n$ the coefficients of a vector $\vec{v}$ with respect to that basis. Define your inner product as the mapping $(v,w) \mapsto \sum v^i w^i$. The mapping is different for distinct bases. – Willie Wong May 07 '18 at 18:07
-
I'll guess that it will be the same in finite dimensional cases and more difficult in others – SK19 May 07 '18 at 18:07
-
For an interesting example, take $V=\Bbb R[X]$ and the inner products $\langle \sum a_kX^k,\sum b_k X^k\rangle = \sum a_kb_k$ and $\langle f,g\rangle = \int_0^1 f(t)g(t)\,\mathrm dt$. – Hagen von Eitzen May 07 '18 at 18:33
-
@HagenvonEitzen, I have seen that inner product before, but I wanted to ask: how do we come up with such definitions? I know there are axioms that need to be satisfied, but even then, how do we come up with what the final inner product will look like? – F.N. Mar 19 '21 at 06:53
5 Answers
Sure. Let $V=\mathbb R^3$ be a real vector space with standard basis $e_1,e_2,e_3$. Define $f:V \to \mathbb R^3$ by $e_1 \mapsto e_1$, $e_2 \mapsto e_2$, $e_3 \mapsto 2e_3$, extended linearly, and let $\langle \cdot, \cdot \rangle_3$ denote the standard inner product (dot product) on $\mathbb R^3$. Then we can define $\langle x,y \rangle_* := \langle f(x),f(y) \rangle_3$.
Note that $(1,0,1)$ and $(4,0,-1)$ are orthogonal in the inner product with subscript $*,$ i.e., $\langle \cdot,\cdot\rangle_*,$ but not in the standard inner product. Likewise, $(1,1,-2)$ and $(1,1,1)$ are orthogonal in the standard inner product, but not in the inner product with subscript $*.$
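Both claims can be checked numerically. Here is a small pure-Python sketch of the construction above (the helper names `f`, `dot`, and `star` are ours, chosen for illustration):

```python
# f scales the third coordinate by 2, as in the answer above.
def f(v):
    return (v[0], v[1], 2 * v[2])

def dot(x, y):
    """Standard inner product (dot product) on R^3."""
    return sum(a * b for a, b in zip(x, y))

def star(x, y):
    """The modified inner product <x, y>_* = <f(x), f(y)>_3."""
    return dot(f(x), f(y))

# (1,0,1) and (4,0,-1): orthogonal for <,>_* but not for the dot product.
print(star((1, 0, 1), (4, 0, -1)))   # 0
print(dot((1, 0, 1), (4, 0, -1)))    # 3

# (1,1,-2) and (1,1,1): orthogonal for the dot product but not for <,>_*.
print(dot((1, 1, -2), (1, 1, 1)))    # 0
print(star((1, 1, -2), (1, 1, 1)))   # -6
```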


Let $V$ denote your vector space, $\langle u|v\rangle$ an inner product on $V$, and $M:V\to V$ a self-adjoint, positive-definite linear map not proportional to the identity. Then $\langle u|Mv\rangle$ is another one.
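As a concrete sketch of this answer, take $V=\mathbb R^3$ with the dot product and $M=\operatorname{diag}(1,1,2)$, a positive-definite choice of ours that is not proportional to the identity:

```python
# M = diag(1, 1, 2): self-adjoint, positive definite, and not a
# multiple of the identity (our concrete choice for illustration).
M_DIAG = (1, 1, 2)

def inner(u, v):
    """Standard dot product on R^3, playing the role of <u|v>."""
    return sum(a * b for a, b in zip(u, v))

def inner_M(u, v):
    """<u|Mv> for the diagonal M above; still bilinear, symmetric,
    and positive definite, hence an inner product."""
    Mv = tuple(d * b for d, b in zip(M_DIAG, v))
    return inner(u, Mv)

u, v = (1, 0, 1), (2, 0, -2)
print(inner(u, v))    # 0: orthogonal in the original inner product
print(inner_M(u, v))  # -2: no longer orthogonal in <u|Mv>
```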

You mean: Define a new / second inner product such that two vectors orthogonal with respect to the first / old inner product are not orthogonal with respect to the new / second one?
Yes! That would be possible.
If you do not need any practical motivation, take the vector space $\mathbb{C}^n$ and any positive-definite Hermitian matrix $A$. Then $(x,y)\mapsto x^{*}Ay$ is a scalar product, where $x^{*}$ denotes the conjugate transpose. Now first take $A$ to be the identity matrix, and then take $\tilde{A}$ identical to $A$ except with the upper-left coefficient changed to $2$...
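A quick numeric sketch of this answer on $\mathbb{C}^2$ (the helper `herm` is ours; with real integer entries the conjugations are trivial but the formula is the general one):

```python
# Scalar products x* A y on C^2 for A = I and for A~ (upper-left entry 2).
def herm(A, x, y):
    """Compute x* A y, where x* is the conjugate transpose of x."""
    Ay = [sum(A[i][j] * y[j] for j in range(2)) for i in range(2)]
    return sum(x[i].conjugate() * Ay[i] for i in range(2))

I = [[1, 0], [0, 1]]
A_tilde = [[2, 0], [0, 1]]   # identity with the upper-left coefficient set to 2

x, y = (1, 1), (1, -1)
print(herm(I, x, y))        # 0: orthogonal for A = I
print(herm(A_tilde, x, y))  # 1: not orthogonal for A~
```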
-
You are right, I elaborated - I will leave "the question for clarification" in though, as to make clear what the precise scope of my answer would be. – mol3574710n0fN074710n May 07 '18 at 18:22
-
Wait, are you asking for two dot products $\langle\cdot,\cdot\rangle_1$ and $\langle\cdot,\cdot\rangle_2$ where there exist nonzero $x,y$ with both $\langle x,y\rangle_1 = 0$ and $\langle x,y\rangle_2 \neq 0$, or are you asking for $\langle\cdot,\cdot\rangle_1$ and $\langle\cdot,\cdot\rangle_2$ such that for all nonzero $x,y$ satisfying $\langle x,y\rangle_1 = 0$, the other product $\langle x,y\rangle_2$ is necessarily nonzero?
The former is easy: for $x,y \in \mathbb{R}^n$, let $\langle x,y\rangle_1 = \sum_i x_iy_i$, where $w_i$ denotes the value of the $i$-th coordinate of $w \in \mathbb{R}^n$, and let $\langle x,y\rangle_2 = \sum_i a_ix_iy_i$, where the $a_i$ are positive scalars, not all equal (they can be normalized if you require $\langle x,x\rangle_1 = \langle x,x\rangle_2$).
The latter is impossible, at least for $n \geq 3$. Indeed, fix a nonzero $x$ and let $y$ and $z$ be two linearly independent nonzero vectors satisfying $\langle x,y\rangle_1 = \langle x,z\rangle_1 = 0$. [Make sure you see for yourself that such $y$ and $z$ exist.] Write $a = \langle x,y\rangle_2$ and $b = \langle x,z\rangle_2$. If either $a$ or $b$ is $0$, then we have already shown $\langle\cdot,\cdot\rangle_2$ doesn't satisfy the latter property; otherwise let $w=az-by \in \mathbb{R}^n$. Since $w$ is a nontrivial linear combination of the linearly independent nonzero vectors $y$ and $z$, it is nonzero, and it satisfies $\langle x,w\rangle_1 = a\langle x,z\rangle_1 - b\langle x,y\rangle_1 = a \cdot 0 - b \cdot 0 = 0$. But also $\langle x,w\rangle_2 = a\langle x,z\rangle_2 - b\langle x,y\rangle_2 = ab - ba = 0$.
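The impossibility argument can be checked on a concrete instance in $\mathbb{R}^3$. For $\langle\cdot,\cdot\rangle_2$ we take $x^{T}Ay$ with an assumed positive-definite matrix $A$ of our own choosing; the names below are illustrative:

```python
# A is symmetric positive definite (eigenvalues 1, 1, 4), our choice.
A = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]

def ip1(x, y):
    """Standard inner product <,>_1."""
    return sum(p * q for p, q in zip(x, y))

def ip2(x, y):
    """Second inner product <x, y>_2 = x^T A y."""
    return sum(x[i] * A[i][j] * y[j] for i in range(3) for j in range(3))

x = (1, 0, 0)
y, z = (0, 1, 0), (0, 0, 1)     # linearly independent, both <,>_1-orthogonal to x
a, b = ip2(x, y), ip2(x, z)     # here a = b = 1, both nonzero
w = tuple(a * zi - b * yi for zi, yi in zip(z, y))   # w = a z - b y = (0, -1, 1)
print(ip1(x, w), ip2(x, w))     # 0 0: w is orthogonal to x in both products
```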

Given a basis $\{ f_n \}$ of a finite-dimensional vector space $V$ over the real or complex numbers, you can define an inner product on $V$ so that $\{ f_n \}$ is an orthonormal basis. To do this, write $$ x = \sum_{n=1}^{N}\alpha_n f_n,\;\;\; y =\sum_{n=1}^{N}\beta_n f_n, $$ and define $$ \langle x,y\rangle = \sum_{n=1}^{N}\alpha_n \beta_n. $$ (Conjugate the $\beta_n$ if the space is complex.) This is an inner product for which $\{ f_n \}$ is an orthonormal basis.
If you have two vectors $f_1,f_2$ that are orthogonal with respect to $\langle\cdot,\cdot\rangle_1$, use the construction above to define a new inner product $\langle\cdot,\cdot\rangle_2$ for which $\langle f_1,f_1+f_2\rangle_2=0$ (for instance, extend $\{f_1,\, f_1+f_2\}$ to a basis and declare that basis orthonormal). Then $\langle f_1,f_2\rangle_2 \ne 0$, because $\langle f_1,f_2\rangle_2 = 0$ together with $\langle f_1,f_1+f_2\rangle_2=0$ would imply $\langle f_1,f_1\rangle_2=0$.
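A minimal sketch of this construction in $\mathbb{R}^2$, assuming $f_1=(1,0)$ and $f_2=(0,1)$ (orthogonal for the dot product) and declaring the basis $\{f_1,\, f_1+f_2\}$ orthonormal; the helper names are ours:

```python
# Declare the basis {f1, g} with f1 = (1, 0) and g = f1 + f2 = (1, 1)
# orthonormal: expand both arguments in that basis and take the
# coefficient dot product, as in the answer above.
def coords(v):
    """Coefficients of v in the basis f1 = (1, 0), g = (1, 1):
    v = alpha * f1 + beta * g  =>  beta = v[1], alpha = v[0] - v[1]."""
    return (v[0] - v[1], v[1])

def ip2(x, y):
    a, b = coords(x), coords(y)
    return sum(p * q for p, q in zip(a, b))

f1, f2 = (1, 0), (0, 1)
print(ip2(f1, (1, 1)))   # 0: f1 is orthogonal to f1 + f2 in <,>_2
print(ip2(f1, f2))       # -1: f1 and f2 are no longer orthogonal
```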
