Worst additive noise under covariance constraints. Diggavi, S. N. & Cover, T. M. IEEE Transactions on Information Theory, 47(7):3072–3081, November 2001.
This paper started with a simple question: is the maximum entropy noise the worst noise for additive channels? In the case of scalar channels with a power constraint on the noise, this is true, as is well known. However, we show that in the vector case, with covariance constraints, the answer is yes and no. Yes, if the transmit power is large enough, and no otherwise. Along the way we give a solution to the mutual information game with covariance constraints and show that Gaussian solutions form saddle points, but there could also be other saddle points. We also demonstrate that the information rates can be achieved using mismatched (Gaussian) decoders.
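For context, the well-known scalar fact the abstract alludes to can be written as a saddle point of the mutual information game; the sketch below is standard information theory background, not a formula taken from this paper:

```latex
% Scalar mutual information game with power constraints
% E[X^2] <= P on the input and E[Z^2] <= N on the noise:
\max_{p_X:\,\mathbb{E}[X^2]\le P}\;
\min_{p_Z:\,\mathbb{E}[Z^2]\le N}\;
I(X;\,X+Z)
  \;=\; \tfrac{1}{2}\log\!\left(1+\tfrac{P}{N}\right),
```

with the saddle point attained by Gaussian $X \sim \mathcal{N}(0,P)$ and Gaussian (maximum entropy) noise $Z \sim \mathcal{N}(0,N)$. The paper asks whether this Gaussian saddle-point structure survives when the scalar power constraints are replaced by covariance constraints in the vector case.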
@article{DCj01,
 abstract = {This paper started with a simple question: is the maximum entropy noise the worst noise for additive channels? In the case of scalar channels with a power constraint on the noise, this is true, as is well known. However, we show that in the vector case, with covariance constraints, the answer is yes and no. Yes, if the transmit power is large enough, and no otherwise. Along the way we give a solution to the mutual information game with covariance constraints and show that Gaussian solutions form saddle points, but there could also be other saddle points. We also demonstrate that the information rates can be achieved using mismatched (Gaussian) decoders.},
 author = {S. N. Diggavi and T. M. Cover},
 file = {:papers:ps:worstnoise.pdf},
 journal = {IEEE Transactions on Information Theory},
 label = {dc_j01},
 month = {November},
 number = {7},
 pages = {3072--3081},
 tags = {journal,WorstNoise,RobComm,IT},
 title = {Worst additive noise under covariance constraints},
 type = {2},
 volume = {47},
 year = {2001}
}
