11

Let's take $n$ vectors in $\mathbb{R}^n$ at random. What is the probability that these vectors are linearly independent (i.e., that they form a basis of $\mathbb{R}^n$)?

(Of course the problem is equivalent to: taking a matrix at random from $M_{\mathbb{R}}(n,n)$, what is the probability that its determinant is $\neq 0$?)

I don't know whether this question is difficult to answer or not. Please share any information about it! :-)

(The $n$ vectors are meant to have real entries, but I'm also interested in the case of entries in $\mathbb{N}$, $\mathbb{Q}$, or whatever field you like.)
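As a rough sanity check (not a proof), I can draw the entries i.i.d. uniformly on $[0,1]$, say, and count numerically how often the matrix is singular; a quick Python sketch (assuming NumPy):

```python
import numpy as np

# Rough Monte Carlo sanity check (not a proof): draw the entries i.i.d.
# uniformly on [0, 1] and count how often the resulting n x n matrix is
# numerically singular (rank deficient up to floating-point tolerance).
rng = np.random.default_rng(0)
n, trials = 5, 100_000
singular = sum(
    np.linalg.matrix_rank(rng.uniform(0.0, 1.0, size=(n, n))) < n
    for _ in range(trials)
)
print(f"fraction of (numerically) singular matrices: {singular / trials}")
```

I would expect the singular fraction to come out as $0$, but I'd like an actual argument.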

TMM
  • 9,976
Ant
  • 21,098
  • 2
    What distribution are the $n$ vectors taken from? (By the way, the title has $N$ where the body has $n$.) – joriki Jun 29 '13 at 16:51
  • the $n$ vectors are taken from $\mathbb{R}$ :-) – Ant Jun 29 '13 at 16:54
  • 1
    But for which distribution? – Clement C. Jun 29 '13 at 16:54
  • Distribution (sort of) must be specified. But if the components are chosen independently and at random from a continuous distribution, the probability is $1$. – André Nicolas Jun 29 '13 at 16:56
  • uniformly at random.. @AndréNicolas can you explain it? :-) – Ant Jun 29 '13 at 16:57
  • 1
    How do you draw a number uniformly at random from $\mathbb{R}$? – TMM Jun 29 '13 at 17:00
  • 3
    @Ant There is no "ordinary" uniform probability distribution on all of $\mathbb{R}$ (or $\mathbb{R}^k$). – Lord Soth Jun 29 '13 at 17:00
  • 2
    So say uniformly on $[0,1]$. Assume that for $k$ vectors where $k\lt n$, the probability is $1$. These form a subspace of dimension $k$, and therefore of measure $0$. So with probability $1$ the next vector chosen is not in that subspace. – André Nicolas Jun 29 '13 at 17:05
  • Suppose you could take the uniform distribution on $S^{n-1}$; then the probability would be $1$, which should be easy: it's trivial in $\mathbb{R}^1$, and when you have $n-1$ vectors in $\mathbb{R}^n$, the probability of drawing a vector in the hyperplane they span is zero, because the Lebesgue measure of this hyperplane is $0$. It's still just heuristics, under the assumption on the distribution mentioned before. – mm-aops Jun 29 '13 at 17:06

2 Answers

18

As others have pointed out, the main problem is what "taking a vector at random" means. Probability theory requires that one specify a probability measure on ${\mathbb R}^n$ before one can make any predictions about the outcomes of experiments concerning the chosen vectors. E.g., if it is totally unlikely, meaning the probability is zero, that a vector with $x_n\ne 0$ is chosen, then the probability that $n$ independently chosen vectors are linearly independent is $=0$, since with probability $1$ they all lie in the hyperplane $x_n=0$.

A reasonable starting point is to install a rotationally invariant probability measure. As the lengths of the $n$ chosen vectors do not affect their linear dependence or independence, this amounts to choosing $n$ independent vectors uniformly distributed on the sphere $S^{n-1}$. (This informal description has a precise mathematical meaning.)
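For concreteness, this sampling scheme is easy to realize numerically (a Python sketch, assuming NumPy; the helper name is mine): normalizing an i.i.d. standard Gaussian vector gives a point uniformly distributed on $S^{n-1}$, by rotational invariance of the Gaussian.

```python
import numpy as np

# Sketch of the sampling scheme (illustration only): normalizing an i.i.d.
# standard Gaussian vector yields a point uniformly distributed on S^{n-1},
# by rotational invariance of the Gaussian distribution.
def uniform_on_sphere(n, rng):
    x = rng.standard_normal(n)
    return x / np.linalg.norm(x)

rng = np.random.default_rng(1)
n = 4
X = np.column_stack([uniform_on_sphere(n, rng) for _ in range(n)])
print(np.linalg.matrix_rank(X))  # n in practice: the columns are linearly independent
```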

Under this hypothesis the probability that the $n$ chosen vectors $X_k$ are linearly independent is $=1$.

Proof. The first vector $X_1$ is linearly independent with probability $1$, as $|X_1|=1$. Assume that $1< r\leq n$ and that the first $r-1$ vectors are linearly independent with probability $1$. Then with probability $1$ these $r-1$ vectors span a subspace $V$ of dimension $r-1$, which intersects $S^{n-1}$ in an $(r-2)$-dimensional "subsphere" $S_V^{r-2}$. This subsphere has $(n-1)$-dimensional measure $0$ on $S^{n-1}$. Therefore the probability that $X_r$ lies in this subsphere is zero. It follows that with probability $1$ the vectors $X_1$, $\ldots$, $X_{r-1}$, $X_r$ are linearly independent.
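A small numerical illustration of this step (a sketch only, under the sphere-uniform model above): project a freshly drawn $X_r$ onto the span of the previous vectors and look at the orthogonal component.

```python
import numpy as np

# Numerical illustration of the induction step (sphere-uniform model assumed):
# given r-1 independent unit vectors, the component of a fresh X_r orthogonal
# to their span V is essentially never zero.
rng = np.random.default_rng(2)
n, r = 6, 4
V = rng.standard_normal((n, r - 1))
V /= np.linalg.norm(V, axis=0)      # r-1 columns, each uniform on S^{n-1}
Q, _ = np.linalg.qr(V)              # orthonormal basis of span(V)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)              # the new vector X_r, uniform on S^{n-1}
residual = x - Q @ (Q.T @ x)        # part of X_r orthogonal to span(V)
print(np.linalg.norm(residual))     # positive; zero only on a measure-zero event
```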

  • 1
    thank you! This is what I was looking for. I'm not an expert on probability, so my question is rather naive, but thanks for answering :-) – Ant Jun 30 '13 at 12:50
0

This depends on how you take the vectors. In the case $n=2$, for example, if you have a (nonzero) vector $v_1$, the probability of drawing a second vector linearly dependent with $v_1$ is $0$ (say, for any distribution with a density). So the probability of getting two linearly independent vectors is $1$.

Walner
  • 1,062