A simple but important observation.
Let us consider two sinusoidal signals of opposite phases during the time $t \in [0, 1]$: $S_1(t) = \cos(2 \pi t)$ and $S_2(t) = \cos(2 \pi t + \pi)$. Obviously, the signals cancel each other completely, and the mean power of the additive signal $S_1 + S_2$ is just 0:

$\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \int_0^1 (\cos(2 \pi t) + \cos(2 \pi t + \pi))^2 \, dt = \int_0^1 0 \, dt = 0,$    (1)
where $\mathbb{E}[S] = \int_0^1 S(t) \, dt$ is the mean (slightly abusing the notation of the probabilistic expected value). We could try to add the powers of the signals together separately:

$\displaystyle \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2] = \int_0^1 \cos^2(2 \pi t) \, dt + \int_0^1 \cos^2(2 \pi t + \pi) \, dt = \frac{1}{2} + \frac{1}{2} = 1,$    (2)

but the result greatly differs from $0$! Can we then sometimes sum the individual signal powers together or not? This would surely be handy. The question has everything to do with the correlation of the signals.
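As a quick sanity check, the two mean powers above are easy to compute numerically. A minimal sketch in plain Python (the grid size `N` is an arbitrary choice, not from the text):

```python
import math

N = 100_000                 # time-grid resolution (arbitrary choice)
dt = 1.0 / N

def mean(f):
    """Time average E[S] = integral of S(t) over [0, 1], as a Riemann sum."""
    return sum(f(k * dt) for k in range(N)) * dt

def S1(t):
    return math.cos(2 * math.pi * t)

def S2(t):
    return math.cos(2 * math.pi * t + math.pi)   # opposite phase

# Mean power of the summed signal: the signals cancel, so this is 0
power_of_sum = mean(lambda t: (S1(t) + S2(t)) ** 2)
# Sum of the individual mean powers: 1/2 + 1/2 = 1
sum_of_powers = mean(lambda t: S1(t) ** 2) + mean(lambda t: S2(t) ** 2)

print(power_of_sum, sum_of_powers)
```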
Let us assume that $S_1$ and $S_2$ are two signals. We have that

$\displaystyle \mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2 + S_2^2 + 2 S_1 S_2] = \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2] + 2 \mathbb{E}[S_1 S_2],$    (3)

thus the identity $\mathbb{E}[(S_1 + S_2)^2] = \mathbb{E}[S_1^2] + \mathbb{E}[S_2^2]$ holds if and only if the cross-correlation $\mathbb{E}[S_1 S_2] = 0$, i.e. if the signals $S_1$ and $S_2$ are uncorrelated. Clearly, this is not the case with our initial signals, as their cross-correlation is

$\displaystyle \mathbb{E}[S_1 S_2] = \int_0^1 \cos(2 \pi t) \cos(2 \pi t + \pi) \, dt = -\int_0^1 \cos^2(2 \pi t) \, dt = -\frac{1}{2},$

as it should be according to Equations (1), (2) and (3).
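These relations can likewise be verified numerically. A small sketch in plain Python (grid size arbitrary) checks that the cross-correlation of the two example signals is $-1/2$ and that the two sides of (3) agree:

```python
import math

N = 100_000                 # time-grid resolution (arbitrary choice)
dt = 1.0 / N

def mean(f):
    # Time average E[S] over [0, 1], approximated by a Riemann sum
    return sum(f(k * dt) for k in range(N)) * dt

def S1(t):
    return math.cos(2 * math.pi * t)

def S2(t):
    return math.cos(2 * math.pi * t + math.pi)

cross = mean(lambda t: S1(t) * S2(t))          # cross-correlation, -> -1/2
lhs = mean(lambda t: (S1(t) + S2(t)) ** 2)     # left side of (3)
rhs = mean(lambda t: S1(t) ** 2) + mean(lambda t: S2(t) ** 2) + 2 * cross

print(cross, lhs, rhs)
```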
Above, we used deterministic signals, but the same remarks apply to random signals. For example, for independent, uniformly random phases $\phi_1, \phi_2 \in [0, 2\pi)$, let $S_1(t) = \cos(2 \pi t + \phi_1)$ and $S_2(t) = \cos(2 \pi t + \phi_2)$. The expected powers can be summed, as the expectation of the cross-correlation is 0 for the two random signals $S_1$ and $S_2$, i.e., $\mathbb{E}[S_1 S_2] = 0$.
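A small Monte Carlo experiment illustrates the random-phase case (a sketch; the sample counts and random seed are arbitrary choices): averaging the time-domain cross-correlation over many independent uniform phase draws gives a value near 0.

```python
import math
import random

random.seed(0)

N_T = 100        # time-grid points per realization (arbitrary choice)
N_MC = 10_000    # number of random phase pairs (arbitrary choice)
dt = 1.0 / N_T

def cross_correlation(phi1, phi2):
    # Time average of S1(t) * S2(t) for one draw of the phases;
    # analytically this equals cos(phi1 - phi2) / 2
    return sum(
        math.cos(2 * math.pi * k * dt + phi1)
        * math.cos(2 * math.pi * k * dt + phi2)
        for k in range(N_T)
    ) * dt

total = 0.0
for _ in range(N_MC):
    phi1 = random.uniform(0.0, 2 * math.pi)
    phi2 = random.uniform(0.0, 2 * math.pi)
    total += cross_correlation(phi1, phi2)

estimate = total / N_MC   # Monte Carlo estimate of E[S1 S2]
print(estimate)           # close to 0, within sampling noise
```

Each individual draw has cross-correlation $\cos(\phi_1 - \phi_2)/2$, which is generally nonzero; only its expectation over the phases vanishes, which is why the estimate converges to 0 as the number of samples grows.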