This paper considers image classification based on a Markov random field (MRF), where the proposed random field adopts the Jeffreys divergence between category-specific probability densities. The classification method based on the proposed MRF is shown to be an extension of Switzer's smoothing...
Persistent link: https://www.econbiz.de/10005021352
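The truncated abstract above does not show the MRF's energy function, but the Jeffreys divergence it adopts is simply the symmetrized Kullback-Leibler divergence. A minimal sketch, assuming univariate Gaussian category-specific densities (the parameter values are hypothetical, not from the paper):

```python
import numpy as np

def kl_gauss(mu1, var1, mu2, var2):
    """Closed-form KL divergence KL( N(mu1, var1) || N(mu2, var2) )."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def jeffreys_gauss(mu1, var1, mu2, var2):
    """Jeffreys divergence: the symmetrized KL divergence."""
    return kl_gauss(mu1, var1, mu2, var2) + kl_gauss(mu2, var2, mu1, var1)

# Two hypothetical category-specific densities.
print(jeffreys_gauss(0.0, 1.0, 1.5, 2.0))
```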
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both are about likelihood ratios: Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma is about error rates of likelihood ratio tests. Exploring this...
Persistent link: https://www.econbiz.de/10005199566
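The identity the abstract builds on, that the Kullback-Leibler divergence is the expected log-likelihood ratio under the first density, can be checked with a short Monte Carlo sketch; the two unit-variance Gaussian models below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical models P = N(0, 1) and Q = N(1, 1).
def log_p(x): return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
def log_q(x): return -0.5 * (x - 1.0)**2 - 0.5 * np.log(2 * np.pi)

x = rng.normal(0.0, 1.0, size=100_000)   # draws from P
kl_mc = np.mean(log_p(x) - log_q(x))     # expected log-likelihood ratio under P
print(kl_mc)  # ~0.5, matching the closed form (mu1 - mu2)^2 / 2 for unit variances
```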
This paper is concerned with a study of robust estimation in principal component analysis. A class of robust estimators which are characterized as eigenvectors of weighted sample covariance matrices is proposed, where the weight functions recursively depend on the eigenvectors themselves. Also,...
Persistent link: https://www.econbiz.de/10005199601
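The abstract specifies the fixed-point structure (weights depend on the eigenvectors, which in turn come from the weighted covariance) but, in this truncated form, not the weight function itself. A minimal iterative sketch, assuming a Gaussian down-weighting of off-subspace residuals purely for illustration:

```python
import numpy as np

def robust_pca(X, k, n_iter=50, sigma=2.0):
    """Sketch: leading eigenvectors of a weighted sample covariance whose
    observation weights depend recursively on the current eigenvectors.
    The Gaussian weight function is an illustrative assumption, not the
    paper's specific choice."""
    X = X - X.mean(axis=0)
    n, p = X.shape
    V = np.linalg.eigh(X.T @ X / n)[1][:, -k:]   # ordinary PCA as the start
    for _ in range(n_iter):
        resid = X - (X @ V) @ V.T                # residual off the current subspace
        w = np.exp(-np.sum(resid**2, axis=1) / (2 * sigma**2))
        C = (X * w[:, None]).T @ X / w.sum()     # weighted sample covariance
        V = np.linalg.eigh(C)[1][:, -k:]         # re-extract leading eigenvectors
    return V
```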
Let $S$ be a $p \times p$ random matrix having a Wishart distribution $W_p(n, n^{-1}\Sigma)$. For testing a general covariance structure $\Sigma = \Sigma(\xi)$, we consider a class of test statistics $T_h = n \inf_{\xi} \rho_h(S, \Sigma(\xi))$, where $\rho_h(\Sigma_1, \Sigma_2) = \sum_{j=1}^{p} h(\lambda_j)$ is a...
Persistent link: https://www.econbiz.de/10005199699
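A sketch of how such a $T_h$ could be computed, assuming the simplest structure $\Sigma(\xi) = \xi I_p$ and reading $\lambda_j$ as the eigenvalues of $\Sigma_1 \Sigma_2^{-1}$ (a standard convention; the truncated abstract does not confirm it). The choice $h(\lambda) = \lambda - \log\lambda - 1$ is known to recover the likelihood ratio statistic, which gives a closed form to check against:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def T_h(S, n, h):
    """T_h = n * inf_xi sum_j h(lambda_j) for Sigma(xi) = xi * I_p,
    where lambda_j are the eigenvalues of S * Sigma(xi)^{-1} = S / xi."""
    ell = np.linalg.eigvalsh(S)                  # eigenvalues of S
    obj = lambda xi: np.sum(h(ell / xi))
    # For h below, the minimizer tr(S)/p lies between the extreme eigenvalues.
    res = minimize_scalar(obj, bounds=(ell.min(), ell.max()), method="bounded")
    return n * res.fun

h_lr = lambda lam: lam - np.log(lam) - 1.0       # likelihood ratio choice of h

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
S = X.T @ X / 200
print(T_h(S, 200, h_lr))
# Closed form for this case: n * (p * log(tr(S)/p) - log det S)
p = S.shape[0]
print(200 * (p * np.log(np.trace(S) / p) - np.log(np.linalg.det(S))))
```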
In this paper we consider robust parameter estimation based on a certain cross entropy and divergence. The robust estimate is defined as the minimizer of the empirically estimated cross entropy. It is shown that the robust estimate can be regarded as a kind of projection from the viewpoint of a...
Persistent link: https://www.econbiz.de/10005199754
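The truncated abstract does not identify the particular cross entropy, so as one concrete instance the sketch below minimizes an empirical density-power (beta-type) cross entropy for a Gaussian location model; the form of the entropy and all parameter values are illustrative assumptions, not the paper's definition:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def beta_cross_entropy_mean(x, beta=0.5):
    """Sketch: robust location estimate minimizing an empirical
    density-power cross entropy for a unit-variance Gaussian model.
    With the scale fixed, the model-only term is constant in mu, so
    the estimate maximizes sum_i f(x_i; mu)^beta."""
    obj = lambda mu: -np.sum(np.exp(-beta * (x - mu) ** 2 / 2.0))
    grid = np.linspace(x.min(), x.max(), 400)        # coarse global search
    mu0 = grid[np.argmin([obj(m) for m in grid])]
    res = minimize_scalar(obj, bounds=(mu0 - 0.5, mu0 + 0.5), method="bounded")
    return res.x

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])
print(np.mean(x))                   # sample mean, pulled toward the 5% outliers
print(beta_cross_entropy_mean(x))   # robust estimate, stays near 0
```

The down-weighting by $f^{\beta}$ is what makes the estimate resistant to outliers: observations with small model density contribute little to the empirical cross entropy.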