We consider adaptive sequential lossy coding of bounded individual sequences when the performance is measured by the sequentially accumulated mean squared distortion. The encoder and the decoder are connected via a noiseless channel of capacity $R$ and both are assumed to have zero delay. No...
Persistent link: https://www.econbiz.de/10005772112
We derive a new inequality for uniform deviations of averages from their means. The inequality is a common generalization of previous results of Vapnik and Chervonenkis (1974) and Pollard (1986). Using the new inequality we obtain tight bounds for empirical loss minimization learning.
Persistent link: https://www.econbiz.de/10005827454
We derive a new inequality for uniform deviations of averages from their means. The inequality is a common generalization of previous results of Vapnik and Chervonenkis [1974, Theory of Pattern Recognition. Nauka, Moscow] and Pollard [1995, Uniform ratio limit theorems for empirical processes,...
Persistent link: https://www.econbiz.de/10005138169
Given an i.i.d. sample drawn from a density f on the real line, the problem of testing whether f is in a given class of densities is considered. Testing procedures constructed on the basis of minimizing the L1-distance between a kernel density estimate and any density in the hypothesized...
Persistent link: https://www.econbiz.de/10005285140
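To illustrate the kind of statistic this abstract describes, the sketch below (an assumed setup, not the paper's actual procedure) computes a Gaussian kernel density estimate from a sample and its L1-distance to a single hypothesized density, here the standard normal; in the paper the distance is minimized over a whole class:

```python
import numpy as np

def gaussian_kde(samples, x, bandwidth):
    # Gaussian kernel density estimate evaluated on the grid x.
    u = (x[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * u**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

def l1_distance(f_vals, g_vals, dx):
    # Riemann-sum approximation of the L1 distance on a uniform grid.
    return np.abs(f_vals - g_vals).sum() * dx

rng = np.random.default_rng(0)
samples = rng.standard_normal(500)          # sample actually drawn from N(0, 1)

x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]
f_hat = gaussian_kde(samples, x, bandwidth=0.4)
f0 = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)  # hypothesized N(0, 1) density

d = l1_distance(f_hat, f0, dx)
```

A small value of `d` is consistent with the hypothesized density; the L1 distance between any two densities is at most 2, so `d` also gives a natural scale for rejection thresholds.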
We obtain Vapnik-Chervonenkis type upper bounds for the uniform deviation of probabilities from their expectations. The bounds sharpen previously known probability inequalities.
Persistent link: https://www.econbiz.de/10005319809