July—Summing the signal powers

Let us consider two sinusoidal signals of opposite phases during the time $t \in [0,1]$: $S_1(t) = \cos(2 \pi t)$ and $S_2(t) = \cos(2 \pi t + \pi)$. Obviously, the signals cancel each other completely, and the mean power of the additive signal $S_1 + S_2$ is just 0:

  $\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \int_0^1 (\cos(2 \pi t) + \cos(2 \pi t + \pi))^2 dt = \int_0^1 0 \, dt = 0,$ (1)

where $\mathbb{E}[\cdot]$ denotes the time average (slightly abusing the notation of the probabilistic expected value).
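As a sanity check, the cancellation in (1) can be verified numerically. The following is a minimal sketch assuming NumPy; the grid size is arbitrary:

```python
import numpy as np

# Sample one period of each signal on a fine uniform grid.
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
s1 = np.cos(2 * np.pi * t)
s2 = np.cos(2 * np.pi * t + np.pi)

# Approximate the mean power of the additive signal S1 + S2.
mean_power_sum = np.mean((s1 + s2) ** 2)
print(mean_power_sum)  # ~0: the opposite-phase signals cancel
```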

We could try to add the powers of the signals separately together:

$\displaystyle \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2]$ $\displaystyle = \int_0^1 \cos^2(2 \pi t) dt + \int_0^1 \cos^2(2 \pi t + \pi) dt$ (2)
  $\displaystyle = \int_0^1 2 \cos^2(2 \pi t) dt = \int_0^1 \cos(4 \pi t) dt + 1 = 1,$    

but the result significantly differs from the zero mean power obtained in (1)! Can we sometimes sum the individual signal powers together, which would be handy in many applications? This question has everything to do with the correlation of the signals.
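The mismatch is easy to reproduce numerically. A sketch assuming NumPy, with the time averages from (2) approximated on a uniform grid:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
s1 = np.cos(2 * np.pi * t)
s2 = np.cos(2 * np.pi * t + np.pi)

# Sum of the individual mean powers: 1/2 + 1/2 = 1,
# even though the additive signal S1 + S2 has zero power.
power_sum = np.mean(s1**2) + np.mean(s2**2)
print(power_sum)  # ~1.0
```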

Let us assume that $S_1$ and $S_2$ are two signals. We have that

$\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2 + S_2^2 + 2 S_1 S_2]= \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2] + 2 \mathbb{E}[S_1 S_2],$ (3)

thus, the identity $\mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2]$ holds if and only if the cross-correlation $\mathbb{E}[S_1 S_2] = 0$, i.e., if the signals $S_1$ and $S_2$ are uncorrelated. This is not the case with our initial signals, as their cross-correlation is given by

$\displaystyle \mathbb{E}[S_1S_2] = \int_0^1 \cos(2 \pi t) \cos(2 \pi t + \pi) dt = -\int_0^1 \cos^2(2 \pi t) dt = - \frac{1}{2}\int_0^1 \cos(4 \pi t) dt - \frac{1}{2} = - \frac{1}{2},
$

as it should be if we tie together the equations (1), (2) and (3).
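The identity (3) can also be checked numerically for the initial pair of signals. A sketch assuming NumPy:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
s1 = np.cos(2 * np.pi * t)
s2 = np.cos(2 * np.pi * t + np.pi)

cross = np.mean(s1 * s2)                          # cross-correlation, ~ -1/2
lhs = np.mean((s1 + s2) ** 2)                     # power of the sum, ~ 0
rhs = np.mean(s1**2) + np.mean(s2**2) + 2 * cross # right-hand side of (3)
print(cross, lhs, rhs)
```

The two sides agree, and the cross term of $-1$ is exactly what cancels the individual powers.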

We used deterministic signals, but the same remarks apply to random signals. For example, for independent uniformly random phases $\phi_1, \phi_2 \in [0, 2 \pi]$, let $S_1(t) = \cos(2 \pi t + \phi_1)$ and $S_2(t) = \cos(2 \pi t + \phi_2)$; then

  $\displaystyle \mathbb{E}_{\phi_1,\phi_2} \left[\mathbb{E}[(S_1 + S_2)^2]\right] = \mathbb{E}_{\phi_1,\phi_2} \left[\mathbb{E}[S_1^2]\right] + \mathbb{E}_{\phi_1,\phi_2} \left[\mathbb{E}[S_2^2]\right] + 2 \, \mathbb{E}_{\phi_1,\phi_2} \left[\mathbb{E}\left[ S_1 S_2\right]\right]$    
  $\displaystyle = \frac{1}{2 \pi}\int_0^{2 \pi} \int_0^1 \cos^2(2 \pi t + \phi_1) dt \, d \phi_1 + \frac{1}{2 \pi}\int_0^{2 \pi} \int_0^1 \cos^2(2 \pi t + \phi_2) dt \, d \phi_2$    
  $\displaystyle \quad + \frac{2}{(2 \pi)^2}\int_0^{2 \pi} \int_0^{2 \pi} \int_0^1 \cos(2 \pi t + \phi_1) \cos(2 \pi t + \phi_2) dt \, d \phi_1 \, d \phi_2$    
  $\displaystyle = \frac{1}{2 \pi}\int_0^{2 \pi} \int_0^1 \cos^2(2 \pi t + \phi_1) dt \, d \phi_1 + \frac{1}{2 \pi}\int_0^{2 \pi} \int_0^1 \cos^2(2 \pi t + \phi_2) dt \, d \phi_2 = \frac{1}{2} + \frac{1}{2} = 1.$    

The expected powers can be summed, as the expectation of the cross-correlation $\mathbb{E}[S_1 S_2]$ is 0 for the two random signals $S_1$ and $S_2$, i.e., $\mathbb{E}_{\phi_1,\phi_2} \left[\mathbb{E}\left[ S_1S_2\right]\right] =0$.
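The random-phase case lends itself to a Monte Carlo check. A sketch assuming NumPy; the seed, trial count, and grid size are arbitrary choices:

```python
import numpy as np

# Draw many independent phase pairs, time-average each trial over one
# period, then average over the random phases.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1_000, endpoint=False)

n_trials = 5_000
phi1 = rng.uniform(0.0, 2 * np.pi, size=(n_trials, 1))
phi2 = rng.uniform(0.0, 2 * np.pi, size=(n_trials, 1))
s1 = np.cos(2 * np.pi * t + phi1)  # one realization per row
s2 = np.cos(2 * np.pi * t + phi2)

exp_power = np.mean((s1 + s2) ** 2)  # E_phi[ E[(S1 + S2)^2] ], ~1
exp_cross = np.mean(s1 * s2)         # E_phi[ E[S1 S2] ], ~0
print(exp_power, exp_cross)
```

The expected power of the sum comes out near 1, the sum of the individual powers, because the averaged cross-correlation vanishes.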
