TY - GEN

T1 - Minimax Optimal Sequential Tests for Multiple Hypotheses

AU - Faus, Michael

AU - Zoubir, Abdelhak M.

AU - Poor, H. Vincent

N1 - Funding Information:
This work was supported in part by the U.S. National Science Foundation under Grant CNS-1702808.
Publisher Copyright:
© 2018 IEEE.
Copyright:
Copyright 2019 Elsevier B.V., All rights reserved.

PY - 2019/2/5

Y1 - 2019/2/5

N2 - Statistical hypothesis tests are referred to as robust if they are insensitive to small, random deviations from the underlying model. For two hypotheses and fixed sample sizes, robust testing is well studied and understood. However, few results exist for the case in which the number of samples is variable (i.e., sequential testing) and the number of hypotheses is larger than two (i.e., multiple hypothesis testing). This paper outlines a theory of minimax optimal sequential tests for multiple hypotheses under general distributional uncertainty. It is shown that, in analogy to the fixed sample size case, the minimax solution is an optimal test for the least favorable distributions, i.e., a test that optimally separates the most similar feasible distributions. The joint similarity of multiple distributions is shown to be determined by a weighted f-dissimilarity, whose corresponding function is given by the unique solution of a nonlinear integral equation and whose weights are given by the likelihood ratios of the past samples. As a consequence, the least favorable distributions depend on the past observations, and the underlying random process becomes a Markov process whose state variable coincides with the test statistic.

AB - Statistical hypothesis tests are referred to as robust if they are insensitive to small, random deviations from the underlying model. For two hypotheses and fixed sample sizes, robust testing is well studied and understood. However, few results exist for the case in which the number of samples is variable (i.e., sequential testing) and the number of hypotheses is larger than two (i.e., multiple hypothesis testing). This paper outlines a theory of minimax optimal sequential tests for multiple hypotheses under general distributional uncertainty. It is shown that, in analogy to the fixed sample size case, the minimax solution is an optimal test for the least favorable distributions, i.e., a test that optimally separates the most similar feasible distributions. The joint similarity of multiple distributions is shown to be determined by a weighted f-dissimilarity, whose corresponding function is given by the unique solution of a nonlinear integral equation and whose weights are given by the likelihood ratios of the past samples. As a consequence, the least favorable distributions depend on the past observations, and the underlying random process becomes a Markov process whose state variable coincides with the test statistic.

KW - minimax procedures

KW - multiple hypothesis testing

KW - robust hypothesis testing

KW - sequential analysis

UR - http://www.scopus.com/inward/record.url?scp=85062892496&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85062892496&partnerID=8YFLogxK

U2 - 10.1109/ALLERTON.2018.8635956

DO - 10.1109/ALLERTON.2018.8635956

M3 - Conference contribution

AN - SCOPUS:85062892496

T3 - 2018 56th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2018

SP - 1044

EP - 1046

BT - 2018 56th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2018

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 56th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2018

Y2 - 2 October 2018 through 5 October 2018

ER -