Persistent link: https://www.econbiz.de/10010948513
Monte Carlo importance sampling for numerical integration is discussed. We consider a parametric family of sampling distributions and propose the use of the sampling distribution estimated by maximum likelihood. The proposed method of importance sampling using the estimated sampling...
Persistent link: https://www.econbiz.de/10005569431
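
The record above describes importance sampling with a sampling distribution estimated by maximum likelihood. As a rough illustration of the idea (not the paper's own scheme), the sketch below fits a Gaussian sampling density to pilot draws by maximum likelihood and uses it to estimate E[exp(X)] for X ~ N(0, 1); the target integrand, the pilot-draw step, and the Gaussian family are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Target integral: E_p[h(X)] with p = N(0, 1) and h(x) = exp(x); the exact
# value is exp(1/2), which lets us check the estimate.
p = stats.norm(0.0, 1.0)
h = np.exp

# Parametric family of sampling distributions: Gaussians.  Fit the parameters
# by maximum likelihood to pilot draws (scipy's .fit returns the MLEs of the
# normal location and scale).
pilot = p.rvs(size=1_000, random_state=rng)
mu_hat, sd_hat = stats.norm.fit(pilot)
q = stats.norm(mu_hat, sd_hat)

# Importance-sampling estimate of the integral using the fitted density q.
x = q.rvs(size=20_000, random_state=rng)
w = p.pdf(x) / q.pdf(x)                 # importance weights p/q
print(np.mean(w * h(x)), np.exp(0.5))   # estimate vs exact value
```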
Persistent link: https://www.econbiz.de/10010947860
Persistent link: https://www.econbiz.de/10005375759
Persistent link: https://www.econbiz.de/10010948550
This paper considers image classification based on a Markov random field (MRF), where the random field proposed here adopts Jeffreys divergence between category-specific probability densities. The classification method based on the proposed MRF is shown to be an extension of Switzer's smoothing...
Persistent link: https://www.econbiz.de/10005021352
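
The abstract above builds its random field on the Jeffreys divergence between category-specific densities. The sketch below only computes that divergence, in its usual symmetrized-KL form, for two hypothetical Gaussian category densities; the MRF classifier itself is not reproduced, and the closed-form normal KL formula is a standard result rather than anything taken from the paper.

```python
import numpy as np

def kl_normal(mu1, s1, mu2, s2):
    """KL( N(mu1, s1^2) || N(mu2, s2^2) ) in closed form."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

def jeffreys_normal(mu1, s1, mu2, s2):
    """Jeffreys divergence: the symmetrized KL between two category densities."""
    return kl_normal(mu1, s1, mu2, s2) + kl_normal(mu2, s2, mu1, s1)

# Two hypothetical category-specific densities for a single image feature.
print(jeffreys_normal(0.0, 1.0, 2.0, 1.5))
```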
In two-group discriminant analysis, the Neyman-Pearson lemma establishes that the receiver operating characteristic (ROC) curve for an arbitrary linear function lies everywhere below the ROC curve for the true likelihood ratio. The weighted area between these two curves can be used as a risk...
Persistent link: https://www.econbiz.de/10005447031
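
To make the ROC comparison above concrete, the following sketch simulates two Gaussian classes with unequal variances (so the true likelihood ratio is non-linear) and compares the empirical AUC of the log-likelihood-ratio score with that of an arbitrary linear score. The class densities are illustrative, and the printed gap is the unweighted area between the two ROC curves, not the weighted risk defined in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two univariate classes with unequal variances: the true likelihood ratio is
# quadratic in x, so any linear score is suboptimal by the Neyman-Pearson lemma.
p0, p1 = stats.norm(0.0, 1.0), stats.norm(1.0, 2.0)
x0 = p0.rvs(size=2_000, random_state=rng)
x1 = p1.rvs(size=2_000, random_state=rng)

def auc(score0, score1):
    """Empirical area under the ROC curve of the rule 'score > threshold'."""
    return (score1[:, None] > score0[None, :]).mean()

# Score A: the true log-likelihood ratio.  Score B: an arbitrary linear score (x itself).
llr = lambda x: p1.logpdf(x) - p0.logpdf(x)
auc_lr = auc(llr(x0), llr(x1))
auc_linear = auc(x0, x1)

# The likelihood-ratio ROC lies everywhere above the linear one, so its AUC is
# larger; the gap gauges the loss from using the linear score.
print(auc_lr, auc_linear, auc_lr - auc_linear)
```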
Problems of the analysis of data with incomplete observations are all too familiar in statistics. They are doubly difficult if we are also uncertain about the choice of model. We propose a general formulation for the discussion of such problems and develop approximations to the resulting bias of...
Persistent link: https://www.econbiz.de/10005294615
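
The snippet below is not the paper's formulation; it only illustrates the kind of bias the abstract refers to, using the textbook fact that a complete-case mean is biased when the chance of observing a value depends on that value.

```python
import numpy as np

rng = np.random.default_rng(2)

# When larger values are more likely to be missing, the complete-case mean is
# biased downward relative to the full-data mean.
x = rng.normal(0.0, 1.0, size=100_000)
p_observe = 1.0 / (1.0 + np.exp(x))      # observation probability falls with x
observed = rng.random(x.size) < p_observe

print(x.mean(), x[observed].mean())      # full-data mean vs biased complete-case mean
```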
Persistent link: https://www.econbiz.de/10005184636
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both are about likelihood ratios: Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma is about error rates of likelihood ratio tests. Exploring this...
Persistent link: https://www.econbiz.de/10005199566
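
The identity mentioned above, that Kullback-Leibler divergence is the expected log-likelihood ratio, can be checked numerically. The sketch below, with two illustrative Gaussians, compares a Monte Carlo average of the log-likelihood ratio under p with the closed-form KL divergence between normals; the same log-likelihood ratio is also the statistic whose error rates the Neyman-Pearson lemma concerns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# KL(p || q) as the expected log-likelihood ratio under p.
p, q = stats.norm(0.0, 1.0), stats.norm(1.0, 2.0)

x = p.rvs(size=200_000, random_state=rng)
kl_mc = np.mean(p.logpdf(x) - q.logpdf(x))   # E_p[ log p(X) - log q(X) ]

# Closed-form KL between the two normals, for comparison.
mu_p, s_p, mu_q, s_q = 0.0, 1.0, 1.0, 2.0
kl_exact = np.log(s_q / s_p) + (s_p**2 + (mu_p - mu_q) ** 2) / (2 * s_q**2) - 0.5

print(kl_mc, kl_exact)
```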