H. M. H. Shalaby and A. Papamarcou, Error Exponents for Distributed Detection of Markov Sources, IEEE Transactions on Information Theory, 40(2), 1994, pp. 397-408
Number of citations
14
Subject categories
Information Science & Library Science; Engineering, Electrical & Electronic
The paper considers a binary hypothesis testing system in which two sensors simultaneously observe a discrete-time finite-valued stationary ergodic Markov source and transmit M-ary messages to a Neyman-Pearson central detector. The size M of the message alphabet increases at most subexponentially with the number of observations. The asymptotic behavior of the type II error rate is investigated as the number of observations increases to infinity, and the associated error exponent is obtained under mild assumptions on the source distributions. This exponent is independent of the test level epsilon and the actual codebook sizes M, is achieved by a universally optimal sequence of acceptance regions, and is characterized by an infimum of informational divergence rate over a class of infinite-dimensional distributions. Important differences between the asymptotically optimal distributed tests and their nondistributed counterparts, arising because the observations are Markov, are highlighted. The converse results require a blowing-up lemma for stationary ergodic Markov sources, which is also proven.
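As a rough illustration of the kind of statement summarized above, a Stein-lemma-type characterization of the type II error exponent can be written in the following schematic form; the symbols beta_n(epsilon), P_1, D-bar, and the class Q are placeholders introduced here for illustration, and the precise class of distributions and regularity conditions are those defined in the paper itself.

  \lim_{n \to \infty} -\frac{1}{n} \log \beta_n(\epsilon)
    \;=\; \inf_{Q \in \mathcal{Q}} \bar{D}\!\left( Q \,\middle\|\, P_1 \right)

Here \beta_n(\epsilon) denotes the minimum type II error probability at test level \epsilon after n observations, \bar{D}(\cdot\|\cdot) denotes the informational divergence rate with respect to the alternative-hypothesis source distribution P_1, and \mathcal{Q} stands for the class of infinite-dimensional distributions induced by the distributed (two-sensor, M-ary message) setting described in the abstract.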