The computing time for Markov Chain Monte Carlo (MCMC) algorithms can be prohibitively large for datasets with many observations, especially when the data density for each observation is costly to evaluate. We propose a framework where the likelihood function is estimated from a random subset of...
Persistent link: https://www.econbiz.de/10010500806
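A minimal sketch of the core idea, using an assumed toy model (a normal location parameter) and a plain rescaled-subsample estimator of the log-likelihood rather than the authors' exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (an illustrative assumption, not the paper's application):
# n i.i.d. observations from N(theta, 1); we want log p(y | theta).
n, m = 100_000, 1_000                 # full sample size, subsample size
y = rng.normal(2.0, 1.0, size=n)

def loglik_full(theta, y):
    """Exact full-data log-likelihood (up to an additive constant)."""
    return -0.5 * np.sum((y - theta) ** 2)

def loglik_subsample(theta, y, m, rng):
    """Unbiased estimate of the full-data log-likelihood from a simple
    random subsample of size m, rescaled by n/m."""
    idx = rng.choice(y.size, size=m, replace=False)
    return (y.size / m) * np.sum(-0.5 * (y[idx] - theta) ** 2)

theta = 1.9
exact = loglik_full(theta, y)
estimates = [loglik_subsample(theta, y, m, rng) for _ in range(200)]
print(f"exact log-likelihood:    {exact:.1f}")
print(f"subsample estimate mean: {np.mean(estimates):.1f}, sd: {np.std(estimates):.1f}")

# Inside a Metropolis-Hastings sampler one would replace loglik_full with
# loglik_subsample; the resulting chain is only approximate, and the paper's
# framework concerns quantifying and controlling that approximation error.
```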
Many statistical and econometric learning methods rely on Bayesian ideas, often applied or reinterpreted in a frequentist setting. Two leading examples are shrinkage estimators and model averaging estimators, such as weighted-average least squares (WALS). In many instances, the accuracy of these...
Persistent link: https://www.econbiz.de/10012839923
In small samples, and especially in the case of small true default probabilities, standard approaches to credit default probability estimation have certain drawbacks. Most importantly, standard estimators tend to underestimate the true default probability, which is of course an undesirable...
Persistent link: https://www.econbiz.de/10013113964
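One common way to see the problem, together with a standard conservative remedy (not necessarily the estimator proposed in the paper), is to compare the naive default-rate estimate with a Clopper-Pearson upper confidence bound in a low-default setting:

```python
from scipy.stats import beta

def pd_naive(defaults, n):
    """Naive PD estimate: observed default rate (zero when no defaults occur)."""
    return defaults / n

def pd_upper_bound(defaults, n, confidence=0.90):
    """Clopper-Pearson upper confidence bound for a binomial proportion,
    often used as a conservative PD estimate in low-default portfolios."""
    if defaults >= n:
        return 1.0
    return beta.ppf(confidence, defaults + 1, n - defaults)

# Small portfolio with 50 obligors and a small true PD: the naive estimate
# is zero in most samples, while the upper bound stays bounded away from zero.
for k in (0, 1, 2):
    print(k, "defaults:", pd_naive(k, 50), round(pd_upper_bound(k, 50), 4))
```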
We studied the effects of sample size and distribution scale/shape for three types of skewness (g1, G1, and b1) and kurtosis (g2, G2, and b2) using 18 simulated probability distributions. In general, skewness and kurtosis increased with sample size. The order in the skewness...
Persistent link: https://www.econbiz.de/10014242098
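The measures named in the abstract correspond to standard definitions that differ only in how the sample moments are scaled and bias-adjusted; a sketch using the textbook formulas catalogued by Joanes and Gill (1998), which may differ in detail from the paper's implementation:

```python
import numpy as np

def skewness_measures(x):
    """Return the three common sample skewness measures.

    g1: moment estimator m3 / m2**1.5
    G1: g1 adjusted by sqrt(n(n-1))/(n-2)
    b1: m3 / s**3, with s the (n-1)-denominator standard deviation
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()
    m2, m3 = np.mean(d ** 2), np.mean(d ** 3)
    g1 = m3 / m2 ** 1.5
    G1 = g1 * np.sqrt(n * (n - 1)) / (n - 2)
    b1 = g1 * ((n - 1) / n) ** 1.5
    return g1, G1, b1

def kurtosis_measures(x):
    """Return the three common excess-kurtosis measures g2, G2, b2."""
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()
    m2, m4 = np.mean(d ** 2), np.mean(d ** 4)
    g2 = m4 / m2 ** 2 - 3.0
    G2 = ((n + 1) * g2 + 6.0) * (n - 1) / ((n - 2) * (n - 3))
    b2 = (g2 + 3.0) * ((n - 1) / n) ** 2 - 3.0
    return g2, G2, b2

# The sample-size effect described in the abstract can be seen by comparing
# small and large samples drawn from the same skewed distribution.
rng = np.random.default_rng(1)
for n in (10, 100, 10_000):
    sample = rng.lognormal(size=n)
    print(n,
          [round(v, 3) for v in skewness_measures(sample)],
          [round(v, 3) for v in kurtosis_measures(sample)])
```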
Probabilistic editing has been introduced to enable valid inference using established survey sampling theory in situations where some of the collected data points may contain measurement errors and are therefore subjected to an editing process. To reduce the editing effort and avoid over-editing, in...
Persistent link: https://www.econbiz.de/10015207175
We propose a nonparametric Bayesian approach for conducting inference on probabilistic surveys. We use this approach to study whether U.S. Survey of Professional Forecasters density projections for output growth and inflation are consistent with the noisy rational expectations hypothesis. We...
Persistent link: https://www.econbiz.de/10014080529
We present an easily implemented, fast, and accurate method for approximating extreme quantiles of compound loss distributions (frequency and severity) as are commonly used in insurance and operational risk capital models. The Interpolated Single Loss Approximation (ISLA) of Opdyke (2014) is...
Persistent link: https://www.econbiz.de/10012967848
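For context, ISLA builds on the classical single-loss approximation, which for a subexponential severity distribution F and mean annual frequency lambda approximates the alpha-quantile of the aggregate loss by F^{-1}(1 - (1 - alpha)/lambda); a minimal sketch comparing that base approximation with a brute-force Monte Carlo quantile, using illustrative parameters (the ISLA interpolation step itself is not reproduced here):

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)

# Compound loss: N ~ Poisson(lam) losses per period, lognormal severities.
# All parameter values are illustrative assumptions.
lam = 25.0
sev = lognorm(s=2.0, scale=50_000.0)   # heavy-tailed severity distribution
alpha = 0.999                          # capital-style extreme quantile

# Classical single-loss approximation (the quantity that ISLA refines).
sla = sev.ppf(1.0 - (1.0 - alpha) / lam)

# Brute-force Monte Carlo benchmark of the same aggregate-loss quantile.
n_sim = 100_000
counts = rng.poisson(lam, size=n_sim)            # losses per simulated period
losses = sev.rvs(size=counts.sum(), random_state=rng)
period = np.repeat(np.arange(n_sim), counts)     # period index for each loss
totals = np.bincount(period, weights=losses, minlength=n_sim)
mc = np.quantile(totals, alpha)

print(f"single-loss approximation: {sla:,.0f}")
print(f"Monte Carlo quantile:      {mc:,.0f}")
```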
• The first-ever explicit formulation of the concept of an option's probability density functions has been introduced in our publications "Breakthrough in Understanding Derivatives and Option Based Hedging - Marginal and Joint Probability Density Functions of Vanilla Options -- True...
Persistent link: https://www.econbiz.de/10013030477