New bounds for the empirical robust Kullback-Leibler divergence problem


BAHÇECİ U.

Information Sciences, vol. 637, 2023 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 637
  • Publication Date: 2023
  • DOI: 10.1016/j.ins.2023.118972
  • Journal Name: Information Sciences
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Aerospace Database, Applied Science & Technology Source, Business Source Elite, Business Source Premier, Communication Abstracts, Computer & Applied Sciences, INSPEC, Library, Information Science & Technology Abstracts (LISTA), Metadex, MLA - Modern Language Association Database, zbMATH, Civil Engineering Abstracts
  • Keywords: Composite, Kullback-Leibler divergence, Lévy ball, Robust, Universal hypothesis testing
  • Galatasaray University Affiliated: Yes

Abstract

This paper deals with the bounds of the empirical robust Kullback-Leibler (KL) divergence problem, which has been proposed in the literature for use in universal hypothesis testing (UHT). The original problem formulation relies on bounds derived from the Lévy ball. New bounds are proposed and shown to be tighter. A new parameter is also introduced for modifying both the new and the existing bounds. A computational study is then devised to evaluate the performance of the modified test in terms of power for fixed sample sizes. Based on the computational results, we can conclude that the new proposals are promising, as they increase the adaptability of robust/composite hypothesis testing.
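For readers unfamiliar with the statistic behind UHT, the sketch below illustrates the general idea: a Hoeffding-type test computes the KL divergence between the empirical distribution and the nominal (null) distribution, while a robust variant minimizes that divergence over an uncertainty ball around the nominal. This is only a minimal illustration under simplifying assumptions; the uncertainty set here is a total-variation-style ball standing in for the paper's Lévy ball, and all function names, the optimization routine, and the parameter choices are hypothetical rather than taken from the paper.

```python
# Illustrative sketch of an empirical (robust) KL divergence statistic for
# universal hypothesis testing on a discrete alphabet. The uncertainty ball
# below is a simple TV-style relaxation used purely for illustration; it is
# NOT the Lévy-ball formulation or the bounds developed in the paper.

import numpy as np
from scipy.optimize import minimize


def empirical_distribution(samples, alphabet_size):
    """Empirical pmf of i.i.d. samples drawn from {0, ..., alphabet_size - 1}."""
    counts = np.bincount(samples, minlength=alphabet_size)
    return counts / counts.sum()


def kl_divergence(p, q, eps=1e-12):
    """D(p || q) for discrete pmfs, with a small floor to avoid log(0)."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))


def robust_empirical_kl(p_emp, p_nominal, radius):
    """min_{q in ball around p_nominal} D(p_emp || q), solved numerically.

    The ball {q : sum_i |q_i - p_nominal_i| <= 2 * radius, q a pmf} is an
    illustrative stand-in for the Lévy ball of the original formulation.
    """
    k = len(p_nominal)
    constraints = [
        {"type": "eq", "fun": lambda q: q.sum() - 1.0},
        {"type": "ineq", "fun": lambda q: 2.0 * radius - np.abs(q - p_nominal).sum()},
    ]
    bounds = [(1e-9, 1.0)] * k
    result = minimize(
        lambda q: kl_divergence(p_emp, q),
        x0=p_nominal.copy(),
        bounds=bounds,
        constraints=constraints,
    )
    return result.fun


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p0 = np.array([0.5, 0.3, 0.2])            # nominal (null) distribution
    samples = rng.choice(3, size=200, p=[0.4, 0.35, 0.25])
    p_hat = empirical_distribution(samples, 3)

    plain = kl_divergence(p_hat, p0)           # classical Hoeffding-type statistic
    robust = robust_empirical_kl(p_hat, p0, radius=0.05)
    print(f"empirical KL: {plain:.4f}, robust empirical KL: {robust:.4f}")
    # The test rejects the null hypothesis when the (robust) statistic
    # exceeds a chosen threshold; the robust statistic is never larger
    # than the plain one, since the nominal pmf lies inside the ball.
```

In this toy setup, enlarging the ball radius shrinks the robust statistic, which is the trade-off the paper's bounds and the new tuning parameter are concerned with: a larger uncertainty set gives robustness to misspecification of the nominal distribution at the cost of test power.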