Thinking outside the box: Statistical inference based on Kullback-Leibler empirical projections
Suppose that X is a random vector with probability distribution P and suppose that Q = {Q_θ : θ ∈ Θ} denotes a proposed model that involves interesting parameters and relationships between variables. We consider statistical inference procedures for the case where P ∉ Q, constructed as follows: let θ(P) denote the parameter of the distribution in Q that minimizes a Kullback-Leibler (K-L)-type discrepancy K(Q_θ, P) between Q_θ and P. We take θ(P) to be the parameter of interest. The estimate of θ(P), when it exists, is defined by θ̂ = θ(P̂), where P̂ is the empirical probability. We call θ̂ a Kullback-Leibler empirical projection (KLEP). When θ(P̂) does not exist, we extend the concept of a K-L discrepancy to limits of empirical likelihoods to obtain KLEP procedures. Properties of inference procedures based on θ̂ are considered when P ∉ Q. In particular, we compare the naive procedure that uses the standard error applicable when P ∈ Q, the sandwich formula standard error, and the bootstrap standard error, using asymptotic methods and Monte Carlo simulation. For regression experiments with a model based on transforming both response and covariates, we use results of Hernandez and Johnson [1980. The large-sample behavior of transformations to normality. J. Amer. Statist. Assoc. 75, 855-861] to derive KLEP procedures.
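The comparison of standard errors described in the abstract can be illustrated with a minimal numerical sketch. This is not the paper's simulation design: the lognormal/exponential pairing, sample sizes, and variable names are assumptions chosen so that the true P lies outside the model, making the naive and sandwich standard errors diverge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (an assumption, not from the paper): the true P is
# lognormal, while the working model Q_theta is Exponential(rate = theta),
# so P lies "outside the box" {Q_theta}.  For this model the K-L projection
# parameter is theta(P) = 1/E[X]; the KLEP plugs in the empirical mean.
n = 2000
x = rng.lognormal(mean=0.0, sigma=1.0, size=n)
theta_hat = 1.0 / x.mean()                      # KLEP estimate theta(P-hat)

# Naive SE: valid only if P were inside the model, where the Fisher
# information of Exponential(theta) is 1/theta^2.
se_naive = theta_hat / np.sqrt(n)

# Sandwich SE: A^{-1} B A^{-1} / n, with A = E[-d(score)/d(theta)] = 1/theta^2
# and B = Var(score) = Var(X) (score = 1/theta - x), evaluated at theta_hat.
B = x.var(ddof=1)
se_sandwich = theta_hat**2 * np.sqrt(B / n)

# Bootstrap SE: resample the data and recompute the KLEP estimate.
boot = np.array([1.0 / rng.choice(x, size=n, replace=True).mean()
                 for _ in range(500)])
se_boot = boot.std(ddof=1)

print(f"KLEP estimate: {theta_hat:.4f}")
print(f"naive SE:      {se_naive:.4f}")
print(f"sandwich SE:   {se_sandwich:.4f}")
print(f"bootstrap SE:  {se_boot:.4f}")
```

Under this misspecification the sandwich and bootstrap standard errors agree with each other but exceed the naive one, which is the qualitative pattern the paper's comparison concerns.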
Year of publication: 2007
Authors: Doksum, Kjell; Ozeki, Akichika; Kim, Jihoon; Chaibub Neto, Elias
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 77.2007, 12, p. 1201-1213
Publisher: Elsevier
Keywords: KLEP; Box-Cox transformation; Outside the box; K-L divergence; Sandwich formula; Bootstrap; Classification; Covariate transformations
Similar items by person
- Rank tests for the matched pair problem with life distributions / Doksum, Kjell (1980)
- Kim, Jihoon (2022)
- Phua, Joe (2020)