The Jensen–Shannon divergence is a symmetrized and smoothed version of the Kullback–Leibler divergence. Recently, it has been widely applied to the analysis and characterization of symbolic sequences. In this paper we investigate a generalization of the Jensen–Shannon divergence. This...
Persistent link: https://www.econbiz.de/10010591364
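For context, the standard (non-generalized) Jensen–Shannon divergence mentioned in the abstract can be sketched as below; the paper's generalization is not reproduced in the snippet, so only the textbook construction is shown. This is a minimal illustration, assuming discrete probability vectors over a common alphabet.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Assumes p and q are nonnegative arrays summing to 1 over the same
    support; terms with p_i = 0 contribute nothing by convention.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: symmetrized, smoothed KL.

    Both distributions are compared to their mixture m = (p + q) / 2,
    which is strictly positive wherever either input is, so the KL
    terms are always finite.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: two overlapping distributions on a three-letter alphabet.
# jensen_shannon(p, q) ~ 0.3466 (i.e. 0.5 * ln 2).
p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(jensen_shannon(p, q))
```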
We study nonextensive statistical scenarios à la Tsallis with reference to Fisher's information and Rényi's entropy. A new way of evaluating Tsallis' generalized expectation values is examined within such a context and is shown to lead to a much better Cramér–Rao bound than the...
Persistent link: https://www.econbiz.de/10011060258
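For reference, the standard definitions behind this abstract can be written as follows. The snippet does not specify the paper's new scheme for evaluating Tsallis' generalized expectation values, so only the conventional Tsallis and Rényi entropies and the usual normalized (escort) q-expectation are shown.

```latex
% Tsallis entropy of index q (recovers the Shannon entropy as q -> 1):
S_q[p] = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{\,q}\Bigr)

% Renyi entropy, a monotone function of the same sum \sum_i p_i^q:
R_q[p] = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q}

% Normalized q-expectation value (escort average) commonly used in
% nonextensive statistics; the paper examines an alternative evaluation:
\langle A \rangle_q = \frac{\sum_i p_i^{\,q}\, A_i}{\sum_i p_i^{\,q}}
```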