July – summing the signal powers

A simple but important observation for July.

Let us consider two sinusoidal signals of opposite phases over the time interval $t \in [0,1]$: $S_1(t) = \cos(2 \pi t)$ and $S_2(t) = \cos(2 \pi t + \pi).$ Since $\cos(2 \pi t + \pi) = -\cos(2 \pi t)$, the signals cancel each other completely and the mean power of the additive signal $S_1 + S_2$ is just 0:

  $\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \int_0^1 (\cos(2 \pi t) + \cos(2 \pi t + \pi))^2 \, dt = \int_0^1 0 \, dt = 0.$ (2)

If we were a bit silly, we could instead add the powers of the signals separately:

$\displaystyle \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2] = \int_0^1 \cos^2(2 \pi t) \, dt + \int_0^1 \cos^2(2 \pi t + \pi) \, dt$ (3)
  $\displaystyle = \int_0^1 2 \cos^2(2 \pi t) \, dt = \int_0^1 \cos(4 \pi t) \, dt + 1 = 1,$
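
Both computations are easy to confirm numerically. Here is a minimal sketch (not from the original post) that assumes NumPy and approximates the time averages in (2) and (3) by Riemann sums over a uniform grid:

```python
import numpy as np

# Uniform grid on [0, 1); the sample mean over the grid approximates the time average.
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
S1 = np.cos(2 * np.pi * t)
S2 = np.cos(2 * np.pi * t + np.pi)

print(np.mean((S1 + S2) ** 2))              # ~ 0, the true mean power, as in (2)
print(np.mean(S1 ** 2) + np.mean(S2 ** 2))  # ~ 1, the naive sum of powers, as in (3)
```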

but the result differs greatly from the one derived in (2)! Still, e.g. in point process models of wireless networks, the signal powers are often summed as in (3). The often unmentioned assumption behind this has everything to do with the uncorrelatedness (or independence) of the signals:

Let us assume that $S_1$ and $S_2$ are some random variables with mean 0 (or deterministic signals, as a deterministic variable is just a random variable with a degenerate distribution; for the signals above, the expectation is simply the time average over $[0,1]$). We have that

$\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2 + S_2^2 + 2 S_1 S_2]= \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2] + 2 \mathbb{E}[S_1 S_2],$ (4)

thus the identity $\mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2]$ holds if and only if the inner product, or the cross-correlation, $\mathbb{E}[S_1 S_2]$ equals 0, that is, if the random variables (or signals) $S_1$ and $S_2$ are uncorrelated. Clearly, this is not the case with our initial signals, as their cross-correlation (at time shift $\tau = 0$) is given by

$\displaystyle \mathbb{E}[S_1 S_2] = \int_0^1 \cos(2 \pi t) \cos(2 \pi t + \pi) \, dt = -\int_0^1 \cos^2(2 \pi t) \, dt = -\frac{1}{2}\int_0^1 \cos(4 \pi t) \, dt - \frac{1}{2} = -\frac{1}{2},$

as it should be according to equations (2), (3) and (4).
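
Numerically (the same discretization sketch as above, assuming NumPy), the cross-correlation term indeed accounts for the whole difference between (2) and (3):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
S1 = np.cos(2 * np.pi * t)
S2 = np.cos(2 * np.pi * t + np.pi)

cross = np.mean(S1 * S2)                      # ~ -1/2, the cross-correlation term
powers = np.mean(S1 ** 2) + np.mean(S2 ** 2)  # ~ 1, the naive power sum (3)
total = np.mean((S1 + S2) ** 2)               # ~ 0, the true mean power (2)

# Identity (4): E[(S1 + S2)^2] = E[S1^2] + E[S2^2] + 2 E[S1 S2]
print(total, powers + 2 * cross)              # both ~ 0
```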

The uncorrelatedness assumption holds, for example, for independent Gaussian noises, and furthermore, we can often consider all distinct signals in the same communication system to be essentially uncorrelated.
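
For instance, for two independent zero-mean Gaussian noise sequences the cross term vanishes up to sampling error and the powers simply add. A quick Monte Carlo sketch (again assuming NumPy; the noise powers 1 and 2 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent zero-mean Gaussian noises with powers (variances) 1.0 and 2.0.
N1 = rng.normal(0.0, 1.0, n)
N2 = rng.normal(0.0, np.sqrt(2.0), n)

print(np.mean(N1 * N2))                     # ~ 0: essentially uncorrelated
print(np.mean((N1 + N2) ** 2))              # ~ 3.0
print(np.mean(N1 ** 2) + np.mean(N2 ** 2))  # ~ 3.0: the powers add up
```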
