I am trying to compute the characters of $\Lambda^k V$ and $\operatorname{Sym}^k V$ for $V$ an arbitrary representation of a group $G$. I already know what the characters look like for $k=2$, but I cannot find a way to generalize that. My attempt would be to find a way to reduce $\Lambda^k V$ and $\operatorname{Sym}^k V$ to the cases $k=2$ and $k=1$, but I am not sure that is possible. Does anyone have an idea how to solve this?
-
Could you tell us what your result for $k = 2$ is? I'm not sure if I understand what you mean by "computing" the character of $\wedge^k V$ for an arbitrary representation. – Ben Grossmann Feb 19 '20 at 11:14
-
We have $\chi_{\Lambda^2 V}(g) = \frac{1}{2}\left(\chi_V(g)^2 - \chi_V(g^2)\right)$, and for the symmetric power the same expression but with $+$ instead. – S.Farr Feb 19 '20 at 11:27
-
All right, makes sense now, thanks – Ben Grossmann Feb 19 '20 at 11:30
-
Yes, that's pretty much what I was looking for! – S.Farr Feb 19 '20 at 12:25
1 Answer
A partial answer: here is a formula for $\chi_{\vee^k V}(g)$.
We note that $\chi_{\vee^k V}(g) = h_k(\lambda(g))$ and $\chi(g^i) = p_i(\lambda(g))$, where $h_k$ denotes the $k$th complete homogeneous symmetric polynomial, $p_i$ denotes the $i$th power sum $p_i(x_1,\dots,x_n) = x_1^i + \cdots + x_n^i$, and $\lambda(g)$ denotes the vector of eigenvalues of $g$ acting on $V$. The Newton-type identity relating the two states that $$ h_k = \frac 1k \sum_{i=1}^k h_{k-i} \cdot p_i. $$ Plugging the vector of eigenvalues into both sides of the equation yields $$ \chi_{\vee^kV}(g) = \frac 1k \sum_{i=1}^k \chi_{\vee^{k-i}V}(g) \cdot \chi(g^i). $$ This gives you a recursive formula for $\chi_{\vee^kV}(g)$ with arbitrary $k$.
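For concreteness, unrolling this recursion for small $k$ (with $\chi_{\vee^0 V} \equiv 1$) gives $$ \chi_{\vee^2 V}(g) = \tfrac12\left(\chi(g)^2 + \chi(g^2)\right), \qquad \chi_{\vee^3 V}(g) = \tfrac16\left(\chi(g)^3 + 3\chi(g)\chi(g^2) + 2\chi(g^3)\right), $$ the first of which is the $k=2$ formula from the question (with the $+$ sign).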
To complete this answer, the corresponding formula for the alternating character (also given in the post linked above) is obtained the same way: since $\chi_{\Lambda^k V}(g) = e_k(\lambda(g))$, with $e_k$ the $k$th elementary symmetric polynomial, the signed Newton identity $e_k = \frac 1k \sum_{m=1}^k (-1)^{m-1} e_{k-m} \cdot p_m$ gives
$$ \chi_{\Lambda^k V}(g)=\frac{1}{k}\sum_{m=1}^k(-1)^{m-1}\chi_{\Lambda^{k-m}V}(g) \cdot \chi(g^m). $$
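Unrolled in the same way (with $\chi_{\Lambda^0 V} \equiv 1$), this gives $$ \chi_{\Lambda^2 V}(g) = \tfrac12\left(\chi(g)^2 - \chi(g^2)\right), \qquad \chi_{\Lambda^3 V}(g) = \tfrac16\left(\chi(g)^3 - 3\chi(g)\chi(g^2) + 2\chi(g^3)\right), $$ recovering the $k=2$ expression from the question.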
If you look further along the Newton's identities page, there are further expressions for the elementary and complete homogeneous polynomials in terms of power sums that could be applied here. For instance, $$ e_n = \frac1{n!}\begin{vmatrix} p_1 & 1 & 0 & \cdots & 0 \\ p_2 & p_1 & 2 & \ddots & \vdots \\ \vdots & & \ddots & \ddots & 0 \\ p_{n-1} & p_{n-2} & \cdots & p_1 & n-1 \\ p_n & p_{n-1} & \cdots & p_2 & p_1 \end{vmatrix} \implies\\ \chi_{\Lambda^n V}(g) = \frac1{n!}\begin{vmatrix} \chi(g) & 1 & 0 & \cdots & 0 \\ \chi(g^2) & \chi(g) & 2 & \ddots & \vdots \\ \vdots & & \ddots & \ddots & 0 \\ \chi(g^{n-1}) & \chi(g^{n-2}) & \cdots & \chi(g) & n-1 \\ \chi(g^n) & \chi(g^{n-1}) & \cdots & \chi(g^2) & \chi(g) \end{vmatrix}, $$ and replacing the superdiagonal entries $1, 2, \dots, n-1$ by $-1, -2, \dots, -(n-1)$ gives the analogous determinant for $h_n$, and hence for $\chi_{\vee^n V}(g)$.
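If you want to sanity-check these recursions numerically, here is a minimal Python sketch (the function names are just illustrative, and the matrix `g` is a random stand-in for the action of a group element on $V$): it compares both recursions against the direct sums of eigenvalue products over multisets (symmetric power) and over subsets (exterior power).

```python
import itertools
import numpy as np

def power_traces(eigs, k):
    # p_i(lambda(g)) = chi(g^i): sum of i-th powers of the eigenvalues, i = 1..k
    return [sum(l ** i for l in eigs) for i in range(1, k + 1)]

def sym_char(eigs, k):
    # chi_{Sym^k V}(g) via the recursion  h_m = (1/m) * sum_{i=1}^m h_{m-i} p_i
    p = power_traces(eigs, k)
    h = [1.0]  # h_0 = 1
    for m in range(1, k + 1):
        h.append(sum(h[m - i] * p[i - 1] for i in range(1, m + 1)) / m)
    return h[k]

def alt_char(eigs, k):
    # chi_{Lambda^k V}(g) via  e_m = (1/m) * sum_{i=1}^m (-1)^(i-1) e_{m-i} p_i
    p = power_traces(eigs, k)
    e = [1.0]  # e_0 = 1
    for m in range(1, k + 1):
        e.append(sum((-1) ** (i - 1) * e[m - i] * p[i - 1] for i in range(1, m + 1)) / m)
    return e[k]

# Direct definitions: sums of eigenvalue products over multisets (Sym) / subsets (Lambda).
def sym_char_direct(eigs, k):
    return sum(np.prod(c) for c in itertools.combinations_with_replacement(eigs, k))

def alt_char_direct(eigs, k):
    return sum(np.prod(c) for c in itertools.combinations(eigs, k))

rng = np.random.default_rng(0)
g = rng.standard_normal((5, 5))     # stand-in for the matrix of g acting on V
eigs = np.linalg.eigvals(g)
for k in range(1, 5):
    assert np.isclose(sym_char(eigs, k), sym_char_direct(eigs, k))
    assert np.isclose(alt_char(eigs, k), alt_char_direct(eigs, k))
```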

-
It's a quick consequence of the fact that $$ \otimes^k V = \wedge^k V \oplus \vee^k V. $$ If you'd like a reference, the formula is given here, and there seem to be some nice textbooks referenced in the citations. – Ben Grossmann Feb 19 '20 at 11:49
-
I had $\chi_{\otimes^k V}$ written incorrectly before; see my latest edit. – Ben Grossmann Feb 19 '20 at 11:53
-
Can you tell me how you got $\chi_{\operatorname{Sym}^k V}(g) = h_k(\lambda(g))$ and $\chi(g^k) = p_k(\lambda(g))$? – S.Farr Feb 19 '20 at 12:04
-
@lisyarus $\vee^k V$ is $\operatorname{Sym}^k(V)$, in contrast to $\wedge^k V$ which is $\operatorname{Alt}^k(V)$. – Ben Grossmann Feb 19 '20 at 12:09
-
Regarding those first formulas, those follow from the formulas for the eigenvalues of $\vee^k A$ and $A^k$ and the fact that the trace is the sum of eigenvalues. For a reference, see chapter 1 of Bhatia's Matrix Analysis – Ben Grossmann Feb 19 '20 at 12:11
-
If you mean $\bigvee^k V = \operatorname{Sym}^k V$, then the formula $V^{\otimes k} = \bigvee^k V \oplus \bigwedge^k V$ is wrong unless $k=2$, by dimension counting. – lisyarus Feb 19 '20 at 12:13
-
@S.Farr sorry about that; so this is a partial answer I suppose. – Ben Grossmann Feb 19 '20 at 12:18
-
@Omnomnomnom Thanks anyway, maybe the formula for the symmetric power can be derived in an analogous manner – S.Farr Feb 19 '20 at 12:20