3

The total energy of a signal $g(t)$ is $$\int_{-\infty}^{\infty} |g(t)|^2\,dt.$$ If a random variable $X$ has probability density function $f(x)$ and mean $\mu$, then its variance is $$\operatorname{Var}(X)=\int_{\mathbb{R}} x^2 f(x)\,dx-\mu^2.$$ Is there a relation between the variance and the energy of a signal when $\mu=0$? I ask because I found similar topics and questions on this forum (e.g. "What is the relationship between variance and energy"). Moreover, on a university webpage I read, "We think of the variance as the power of the non-constant signal components". These sources did not help me understand whether there is a relation between the variance and the energy of a signal.
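As a concrete numerical sketch of the question (assuming NumPy; the white-noise signal and its parameters are arbitrary illustrative choices): for a zero-mean discrete signal, the sample variance coincides with the average power, i.e. the energy divided by the number of samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random signal: N samples of Gaussian noise (arbitrary choice).
N = 100_000
x = rng.normal(loc=0.0, scale=2.0, size=N)
x = x - x.mean()  # enforce an exactly zero sample mean

# Sample variance (population convention, ddof=0).
variance = np.var(x)

# Average power: energy (sum of |x|^2) per sample.
avg_power = np.sum(np.abs(x) ** 2) / N

# For a zero-mean signal these coincide: Var(X) = E[X^2] when mu = 0.
print(np.isclose(variance, avg_power))  # True
```

So the variance matches the *per-sample* energy (power) rather than the total energy itself, which already hints at the power-based answers below.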

Snoop
Mark

2 Answers

5

I will make the case for signals described as sequences of random variables. Let $(X_k)_{k \in \mathbb{Z}}$ be our signal. We suppose that $(X_k)_{k \in \mathbb{Z}}$ is weakly stationary (WS), and therefore has a stationary autocovariance function $(\gamma_X(\nu))_{\nu \in \mathbb{Z}}$. For simplicity we further suppose that $E[X_k]=0$ for all $k$ (de-meaning reduces the general case to this one). The power spectral density (PSD) is (and is sometimes defined as) the Fourier transform of the autocovariance function: $$S_X(\xi)=\sum_{k \in \mathbb{Z}}\gamma_X(k)e^{-ik\xi}.$$ The autocovariance function can be recovered from the PSD by the inverse Fourier transform: $$\gamma_X(k)=\frac{1}{2\pi}\int_{(-\pi,\pi]}S_X(\xi)e^{ik\xi}\,d\xi.$$ Now notice that $\gamma_X(0)=\sigma_X^2=E[X_k^2]$ for all $k$; that is, the zero-lag autocovariance is the variance. But then $$\sigma_X^2=\frac{1}{2\pi}\int_{(-\pi,\pi]}S_X(\xi)\,d\xi,$$ so the variance of a WS random signal is the total power of the signal. Note that variance only really makes sense when signals are random, in which case we must speak of power (an ergodic quantity of random signals) and not of energy (the squared $\ell^2$ norm of a deterministic signal in $\ell^2(\mathbb{Z})$).
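The identity $\sigma_X^2=\frac{1}{2\pi}\int_{(-\pi,\pi]}S_X(\xi)\,d\xi$ has a discrete counterpart that can be checked numerically: averaging the periodogram (a raw PSD estimate) over all FFT frequencies recovers the sample variance. A minimal sketch assuming NumPy, with white Gaussian noise as the illustrative WS signal:

```python
import numpy as np

rng = np.random.default_rng(1)

# A zero-mean WS signal: white Gaussian noise (illustrative choice).
N = 4096
x = rng.normal(size=N)
x = x - x.mean()  # enforce an exactly zero sample mean

# Periodogram: raw PSD estimate, |FFT(x)|^2 / N (NumPy's unnormalized FFT).
periodogram = np.abs(np.fft.fft(x)) ** 2 / N

# Discrete analogue of sigma^2 = (1/2pi) * integral of S_X(xi):
# the frequency average of the periodogram equals the variance.
psd_average = periodogram.mean()
variance = np.var(x)
print(np.isclose(psd_average, variance))  # True, by Parseval's theorem
```

The agreement is exact (up to floating-point error), since it is just Parseval's theorem for the DFT; no ergodic averaging is involved at this step.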

Snoop
-1

Let $X$ be a random variable with a pdf (probability density function) $f_X(x)$, mean $\mu_X = E\{X\}$, and variance $\sigma_X^2 = E\{(X-\mu_X)^2\}$. Then the following relation holds:

$$ E\{X^2\} = E\{(X-\mu_X)^2\} + \left(E\{X\}\right)^2 = \sigma_X^2 + \mu_X^2 \tag{1}$$

In Eq. (1), the LHS $E\{X^2\}$ is known as the total power of the random variable $X$, whereas the terms $\sigma_X^2$ and $\mu_X^2$ are known as the AC and DC powers, respectively.
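A quick Monte Carlo sketch of Eq. (1), assuming NumPy (the Gaussian distribution and its parameters are arbitrary illustrative choices): the sample mean square splits exactly into the sample variance (AC power) plus the squared sample mean (DC power).

```python
import numpy as np

rng = np.random.default_rng(2)

# Samples of a random variable with nonzero mean (arbitrary parameters).
mu, sigma = 3.0, 1.5
x = rng.normal(loc=mu, scale=sigma, size=1_000_000)

total_power = np.mean(x ** 2)   # estimate of E{X^2}
ac_power = np.var(x)            # estimate of sigma_X^2 (AC power)
dc_power = np.mean(x) ** 2      # estimate of mu_X^2   (DC power)

# Eq. (1): total power = AC power + DC power.
print(np.isclose(total_power, ac_power + dc_power))  # True
```

The decomposition holds exactly for the sample statistics themselves (it is an algebraic identity when `np.var` uses `ddof=0`, its default), not merely in the large-sample limit.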