Let us consider two sinusoidal signals of opposite phases during the time $t \in [0, T]$: $x_1(t) = \sin(\omega t)$ and $x_2(t) = \sin(\omega t + \pi) = -\sin(\omega t)$, where $T$ is a multiple of the period $2\pi/\omega$. Obviously, the signals cancel each other completely, and the mean power of the additive signal is just 0:

$$\mathrm{E}\big[(x_1(t) + x_2(t))^2\big] = 0, \tag{1}$$

where $\mathrm{E}[\,\cdot\,]$ is the mean over $[0, T]$ (slightly abusing the notation of the probabilistic expected value).
We could try to add the powers of the signals separately together:

$$\mathrm{E}[x_1^2(t)] + \mathrm{E}[x_2^2(t)] = \frac{1}{2} + \frac{1}{2} = 1, \tag{2}$$

but the result significantly differs from the 0 of equation (1)! Can we sometimes sum the individual signal powers together, which would be handy in many applications? This question has everything to do with the correlation $\mathrm{E}[x_1(t)\,x_2(t)]$ of the signals.
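As a quick numerical sanity check, here is a minimal sketch of the two computations above, assuming unit-amplitude sinusoids with $\omega = 2\pi$ and using numpy for the time averages (the grid size is an arbitrary choice):

```python
import numpy as np

# Time grid over one full period (assumption: omega = 2*pi and unit
# amplitude, so the period is exactly 1).
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
x1 = np.sin(2 * np.pi * t)     # x1(t) = sin(wt)
x2 = -np.sin(2 * np.pi * t)    # x2(t) = sin(wt + pi) = -sin(wt)

print(np.mean((x1 + x2) ** 2))              # 0.0 -- equation (1)
print(np.mean(x1 ** 2) + np.mean(x2 ** 2))  # 1.0 -- equation (2)
```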
Let us assume that $x(t)$ and $y(t)$ are two signals. We have that

$$\mathrm{E}\big[(x(t) + y(t))^2\big] = \mathrm{E}[x^2(t)] + 2\,\mathrm{E}[x(t)\,y(t)] + \mathrm{E}[y^2(t)]; \tag{3}$$

thus, the identity

$$\mathrm{E}\big[(x(t) + y(t))^2\big] = \mathrm{E}[x^2(t)] + \mathrm{E}[y^2(t)]$$

holds if and only if the cross-correlation

$$\mathrm{E}[x(t)\,y(t)] = 0,$$

i.e., if the signals $x(t)$ and $y(t)$ are not correlated. This is not the case with our initial signals, as then the cross-correlation is given by

$$\mathrm{E}[x_1(t)\,x_2(t)] = \mathrm{E}[-\sin^2(\omega t)] = -\frac{1}{2},$$

as it should be if we tie together the equations (1), (2) and (3): $0 = 1 + 2\cdot\left(-\tfrac{1}{2}\right)$.
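The identity (3) is easy to verify numerically. The sketch below checks it for an arbitrary pair of signals and then evaluates the cross-correlation of our opposite-phase pair; the choice of a random second signal, the seed, and $\omega = 2\pi$ are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 10_000, endpoint=False)
x = np.sin(2 * np.pi * t)
y = rng.standard_normal(t.size)   # an arbitrary second signal

# Identity (3): expanding the square introduces the cross term 2*E[xy].
lhs = np.mean((x + y) ** 2)
rhs = np.mean(x ** 2) + 2 * np.mean(x * y) + np.mean(y ** 2)
print(np.isclose(lhs, rhs))       # True

# Cross-correlation of the initial opposite-phase pair:
print(np.mean(x * -x))            # -0.5, matching -E[sin^2(wt)]
```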
We used deterministic signals, but the same remarks apply to random signals. For example, for uniformly random phases $\phi_1, \phi_2 \in [0, 2\pi)$, let $x_1(t) = \sin(\omega t + \phi_1)$ and $x_2(t) = \sin(\omega t + \phi_2)$. The expected powers can be summed, as the expectation of the cross-correlation is 0 for the two random signals $x_1(t)$ and $x_2(t)$, i.e., $\mathrm{E}[x_1(t)\,x_2(t)] = 0$.
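A small Monte Carlo sketch can illustrate this as well, averaging the cross-correlation over many independently drawn phase pairs; the number of trials, the seed, and $\omega = 2\pi$ are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1_000, endpoint=False)

# Average the cross-correlation over many independent uniform phase pairs.
n_trials = 10_000
total = 0.0
for _ in range(n_trials):
    phi1, phi2 = rng.uniform(0.0, 2.0 * np.pi, size=2)
    total += np.mean(np.sin(2 * np.pi * t + phi1) * np.sin(2 * np.pi * t + phi2))

print(total / n_trials)  # close to 0: the expected cross-correlation vanishes
```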