
Abstract

The Gamma and Log-Normal distributions are frequently used in reliability analysis of lifetime data. The two distributions overlap in many cases, which makes it difficult to choose the better one. The ratio of maximized likelihood (RML) has been extensively used for choosing between them. Since the Kullback-Leibler information is a measure of the discrepancy between two distribution functions, in this paper we examine the use of the Kullback-Leibler Divergence (KLD) in discriminating between the Gamma and Log-Normal distributions. To this end, the ratio of minimized Kullback-Leibler Divergence (RMKLD) test statistic is introduced, and its applicability is illustrated with two real data sets. Although the new test statistic is shown to be consistent with RML, the KLD-based approach has a higher probability of correct selection when the null hypothesis is the Gamma distribution.
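
As background, the RML criterion referred to above fits both candidate models by maximum likelihood and selects the one with the larger maximized log-likelihood. The sketch below illustrates this standard criterion only (not the paper's RMKLD statistic, whose exact minimization is defined in the full text); the function name, the fixed-at-zero location parameters, and the simulated data are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def rml_discriminate(x):
    """Log-ratio of maximized likelihoods T = log L_Gamma - log L_LogNormal.
    Choose Gamma if T > 0, Log-Normal otherwise (standard RML rule)."""
    x = np.asarray(x, dtype=float)

    # MLE fit of the two-parameter Gamma (shape, scale), location fixed at 0
    a_hat, _, scale_ga = stats.gamma.fit(x, floc=0)
    loglik_ga = np.sum(stats.gamma.logpdf(x, a_hat, loc=0, scale=scale_ga))

    # MLE fit of the Log-Normal: shape = sigma, scale = exp(mu), location fixed at 0
    s_hat, _, scale_ln = stats.lognorm.fit(x, floc=0)
    loglik_ln = np.sum(stats.lognorm.logpdf(x, s_hat, loc=0, scale=scale_ln))

    t = loglik_ga - loglik_ln
    return ("Gamma" if t > 0 else "Log-Normal"), t

# Example: data simulated from a Gamma distribution (hypothetical usage)
rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=1.5, size=200)
print(rml_discriminate(sample))
```

In a simulation study of this kind, repeating the experiment many times and recording how often the true family is selected gives the probability of correct selection that the abstract compares between RML and RMKLD.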

Keywords

Gamma Distribution; Kullback-Leibler Divergence; Log-Normal Distribution; Model Discrimination; Probability of Correct Selection; Ratio of Maximized Likelihood

Article Details

How to Cite
Bromideh, A. A., & Valizadeh, R. (2014). Discrimination between Gamma and Log-Normal Distributions by Ratio of Minimized Kullback-Leibler Divergence. Pakistan Journal of Statistics and Operation Research, 9(4), 443-453. https://doi.org/10.18187/pjsor.v9i4.487