New bounds for the empirical robust Kullback-Leibler divergence problem


Information Sciences, vol. 637, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 637
  • Publication Date: 2023
  • DOI Number: 10.1016/j.ins.2023.118972
  • Journal Name: Information Sciences
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Aerospace Database, Applied Science & Technology Source, Business Source Elite, Business Source Premier, Communication Abstracts, Computer & Applied Sciences, INSPEC, Library, Information Science & Technology Abstracts (LISTA), Metadex, MLA - Modern Language Association Database, zbMATH, Civil Engineering Abstracts
  • Keywords: Composite, Kullback-Leibler divergence, Lévy ball, Robust, Universal hypothesis testing
  • Galatasaray University Affiliated: Yes


This paper deals with the bounds of the empirical robust Kullback-Leibler (KL) divergence problem, which was proposed in the literature for universal hypothesis testing (UHT). The original problem formulation relies on bounds derived from the Lévy ball. New bounds are proposed and shown to be tighter. A new parameter is also introduced to modify both the new and the existing bounds. A computational study is then devised to evaluate the power of the modified test for fixed sample sizes. Based on the computational results, we conclude that the new proposals are promising, as they increase the adaptability of robust/composite hypothesis testing.
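As background for the abstract, the quantity at the core of the UHT setting is the KL divergence between the empirical distribution of the observed samples and a nominal distribution. The sketch below (not the paper's bounds or test, which are not reproduced here) illustrates this empirical KL divergence on a finite alphabet with i.i.d. samples; all names and the nominal pmf are illustrative assumptions:

```python
import numpy as np

def empirical_kl(samples, nominal_pmf, support):
    """KL divergence D(mu_n || p) between the empirical pmf mu_n of
    `samples` and a nominal pmf `p` on a finite `support`.
    Symbols with zero empirical mass contribute 0 by convention."""
    counts = np.array([np.sum(samples == s) for s in support], dtype=float)
    mu = counts / counts.sum()                  # empirical pmf mu_n
    p = np.asarray(nominal_pmf, dtype=float)
    mask = mu > 0                               # 0 * log(0/p) := 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / p[mask])))

# Illustrative use: samples drawn from the nominal model itself,
# so the divergence should concentrate near zero as n grows.
rng = np.random.default_rng(0)
support = np.arange(4)
p = np.array([0.4, 0.3, 0.2, 0.1])
samples = rng.choice(support, size=500, p=p)
d = empirical_kl(samples, p, support)
```

A UHT-style test would compare such a statistic (or, in the robust version studied in the paper, its worst case over a neighborhood such as a Lévy ball around the nominal distribution) against a threshold.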