We consider uncertainty classes of noise distributions defined by a bound on the divergence with respect to a nominal noise distribution. The noise distribution that maximizes the minimum error probability for binary-input channels is found. The effect of the reduction in uncertainty brought about by knowledge of the signal-to-noise ratio is also studied. The particular class of Gaussian nominal distributions provides an analysis tool for near-Gaussian channels. The asymptotic behavior of the least favorable noise distribution and of the resulting error probability is studied in a variety of scenarios, namely: asymptotically small divergence with and without power constraint; asymptotically large divergence with and without power constraint; and asymptotically large signal-to-noise ratio.
Andrew L. McKellips, Sergio Verdú
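As a minimal sketch of the setup the abstract describes (the symbols $P_0$, $\lambda$, $\sigma^2$, $\epsilon(\cdot)$, and $\mathcal{D}_\lambda$ below are illustrative notation, not necessarily that of the paper), the divergence-ball uncertainty class and the least favorable noise can be written as
\[
\mathcal{D}_{\lambda} \;=\; \bigl\{\, P : D(P \,\|\, P_0) \le \lambda \,\bigr\},
\qquad
P^{\star} \;=\; \arg\max_{P \in \mathcal{D}_{\lambda}} \, \epsilon(P),
\]
where $D(P \,\|\, P_0) = \int \log \tfrac{dP}{dP_0}\, dP$ is the divergence (relative entropy) with respect to the nominal noise distribution $P_0$, and $\epsilon(P)$ is the minimum error probability of the binary-input channel $Y = X + N$, $X \in \{-1,+1\}$, $N \sim P$, attained by maximum-likelihood detection. The power-constrained variant mentioned in the abstract would further restrict the class to $\mathcal{D}_{\lambda,\sigma^2} = \{\, P \in \mathcal{D}_{\lambda} : \int x^2\, dP(x) \le \sigma^2 \,\}$.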