## July – summing the signal powers

A simple but important observation for July.

Let us consider two sinusoidal signals of opposite phases during the time $t \in [0, T]$, where $T$ is a multiple of the period $2\pi/\omega$:

$$x_1(t) = \sin(\omega t) \quad \text{and} \quad x_2(t) = -\sin(\omega t) = \sin(\omega t + \pi). \tag{1}$$

Obviously, the signals cancel each other completely, and the mean power of the additive signal is just 0:

$$\frac{1}{T}\int_0^T \big(x_1(t) + x_2(t)\big)^2\,\mathrm{d}t = 0. \tag{2}$$
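As a quick numerical sanity check (a sketch using NumPy; the concrete choices $\omega = 2\pi$ and $T = 1$, i.e. one full period, are mine), the mean power of the summed signal is indeed zero:

```python
import numpy as np

# One full period of two opposite-phase sinusoids; the choices
# omega = 2*pi and T = 1 are illustrative assumptions.
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
x1 = np.sin(2 * np.pi * t)
x2 = -np.sin(2 * np.pi * t)

# Time-averaged (mean) power of the additive signal, as in (2).
power_of_sum = np.mean((x1 + x2) ** 2)
print(power_of_sum)  # 0.0 -- the signals cancel sample by sample
```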

If we were a bit silly, we could try to add the powers of the signals separately:

$$\frac{1}{T}\int_0^T x_1(t)^2\,\mathrm{d}t + \frac{1}{T}\int_0^T x_2(t)^2\,\mathrm{d}t = \frac{1}{2} + \frac{1}{2} = 1, \tag{3}$$

but the result greatly differs from the one derived in (2)! Still, e.g. in point-process models of wireless networks, the signal powers are often summed as in (3). The often-unmentioned assumption behind this has everything to do with the independence of the signals:

let us assume that $X$ and $Y$ are some random variables of mean 0 (or deterministic variables, as a deterministic variable is just a random variable with a degenerate distribution). We have that

$$\mathbb{E}\big[(X + Y)^2\big] = \mathbb{E}[X^2] + \mathbb{E}[Y^2] + 2\,\mathbb{E}[XY]; \tag{4}$$

thus the identity $\mathbb{E}[(X+Y)^2] = \mathbb{E}[X^2] + \mathbb{E}[Y^2]$ holds if and only if the inner product, or the cross-correlation, vanishes – that is, $\mathbb{E}[XY] = 0$ – i.e. if the random variables (or signals) $X$ and $Y$ are not correlated. Clearly, this is not the case with our initial signals, as their cross-correlation (with time shift $\tau = 0$) is given by

$$\frac{1}{T}\int_0^T x_1(t)\,x_2(t)\,\mathrm{d}t = -\frac{1}{T}\int_0^T \sin^2(\omega t)\,\mathrm{d}t = -\frac{1}{2},$$

as it should be according to equations (2), (3) and (4): $0 = \tfrac{1}{2} + \tfrac{1}{2} + 2\cdot\big(-\tfrac{1}{2}\big)$.
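The whole bookkeeping can be checked numerically (again a NumPy sketch with the illustrative choices $\omega = 2\pi$, $T = 1$): the individual powers are $1/2$ each, the cross-correlation is $-1/2$, and the cross term in (4) exactly cancels the sum of powers in (3):

```python
import numpy as np

# Same illustrative signals as before: omega = 2*pi, T = 1 (one period).
t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
x1 = np.sin(2 * np.pi * t)
x2 = -np.sin(2 * np.pi * t)

p1 = np.mean(x1 ** 2)     # ~1/2, first term of (3)
p2 = np.mean(x2 ** 2)     # ~1/2, second term of (3)
cross = np.mean(x1 * x2)  # ~-1/2, cross-correlation at lag 0

# (4) for time averages: power of the sum = p1 + p2 + 2*cross.
power_of_sum = p1 + p2 + 2 * cross
print(p1 + p2, cross, power_of_sum)  # ~1.0, ~-0.5, ~0.0
```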

The uncorrelatedness assumption holds for independent Gaussian noises – and, furthermore, we can often consider all distinct signals in the same communication system to be essentially uncorrelated.
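For contrast, here is a hedged sketch with two independent Gaussian noise signals (unit variance; the seed and sample count are arbitrary choices of mine): the empirical cross term is close to zero, so summing the powers as in (3) is a good approximation in this case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent zero-mean, unit-variance Gaussian noise signals
# (sample count and seed are arbitrary choices).
n = 1_000_000
w1 = rng.standard_normal(n)
w2 = rng.standard_normal(n)

power_of_sum = np.mean((w1 + w2) ** 2)               # left side of (4)
sum_of_powers = np.mean(w1 ** 2) + np.mean(w2 ** 2)  # drops the cross term
cross_term = 2 * np.mean(w1 * w2)                    # ~0 by independence

print(power_of_sum, sum_of_powers, cross_term)
```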
