A simple but important observation.
Let us consider two sinusoidal signals of opposite phases during the time $t \in [0, 2\pi]$:

$$x_1(t) = \sin(t)$$

and

$$x_2(t) = \sin(t + \pi) = -\sin(t).$$

Obviously, the signals cancel each other completely and the mean power of the sum signal is just 0:

$$\langle (x_1(t) + x_2(t))^2 \rangle = 0, \tag{2}$$

where $\langle \cdot \rangle$ denotes the mean over the interval, $\langle f(t) \rangle = \frac{1}{2\pi} \int_0^{2\pi} f(t)\,\mathrm{d}t$.
If we were a bit silly, we could try to add the powers of the signals separately:

$$\langle x_1(t)^2 \rangle + \langle x_2(t)^2 \rangle = \frac{1}{2} + \frac{1}{2} = 1, \tag{3}$$
but the result greatly differs from the derivation in (2)! Still, e.g. in the point process models of wireless networks, the signal powers are often summed as in (3). The often unmentioned assumption behind this has everything to do with the correlation of the signals.
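As a sanity check, both computations are easy to reproduce numerically. A minimal sketch with NumPy, assuming unit-amplitude signals over one period $[0, 2\pi]$ as above:

```python
import numpy as np

# Dense time grid over one period (unit amplitudes assumed).
t = np.linspace(0.0, 2.0 * np.pi, 100_000)
x1 = np.sin(t)
x2 = np.sin(t + np.pi)  # opposite phase: equals -sin(t)

# Mean power of the sum signal, as in (2): the signals cancel.
p_sum = np.mean((x1 + x2) ** 2)

# The "silly" sum of the individual mean powers, as in (3).
p_separate = np.mean(x1 ** 2) + np.mean(x2 ** 2)

print(p_sum)       # ~0
print(p_separate)  # ~1
```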
Let us assume that $x_1(t)$ and $x_2(t)$ are two signals. We have that

$$\langle (x_1(t) + x_2(t))^2 \rangle = \langle x_1(t)^2 \rangle + \langle x_2(t)^2 \rangle + 2 \langle x_1(t) x_2(t) \rangle, \tag{4}$$

thus the identity

$$\langle (x_1(t) + x_2(t))^2 \rangle = \langle x_1(t)^2 \rangle + \langle x_2(t)^2 \rangle$$

holds if and only if the cross-correlation

$$\langle x_1(t) x_2(t) \rangle = 0,$$

i.e. if the signals $x_1(t)$ and $x_2(t)$ are not correlated. Clearly, this is not the case with our initial signals, as then the cross-correlation is given by

$$\langle \sin(t) \cdot (-\sin(t)) \rangle = -\langle \sin^2(t) \rangle = -\frac{1}{2},$$

as it should be according to the equations (2), (3) and (4).
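The identity (4) and the cross-correlation of the example signals can likewise be checked numerically (again a sketch assuming unit amplitudes over one period):

```python
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 100_000)
x1 = np.sin(t)
x2 = np.sin(t + np.pi)

# Left- and right-hand sides of the expansion identity (4).
lhs = np.mean((x1 + x2) ** 2)
cross = np.mean(x1 * x2)  # cross-correlation term, close to -1/2
rhs = np.mean(x1 ** 2) + np.mean(x2 ** 2) + 2.0 * cross

print(lhs, rhs, cross)  # lhs ≈ rhs ≈ 0, cross ≈ -0.5
```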
We used deterministic signals, but the same remarks apply to random signals or general random variables.
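For zero-mean random variables the same identity reads $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X,Y)$, since the mean power of a zero-mean variable is its variance. A small sketch with illustrative Gaussian samples (the distributions are my choice, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Y = -X reproduces the fully anti-correlated case above;
# Z is drawn independently of X.
x = rng.standard_normal(1_000_000)
y = -x
z = rng.standard_normal(1_000_000)

print(np.var(x + y))          # 0: powers do not add
print(np.var(x + z))          # ~2: uncorrelated, powers do add
print(np.var(x) + np.var(z))  # ~2
```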