Many statistical and econometric learning methods rely on Bayesian ideas, often applied or reinterpreted in a frequentist setting. Two leading examples are shrinkage estimators and model averaging estimators, such as weighted-average least squares (WALS). In many instances, the accuracy of these...
Persistent link: https://www.econbiz.de/10012839923
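The shrinkage idea this entry refers to can be illustrated with a positive-part James-Stein estimator — a generic sketch of frequentist shrinkage toward zero, not the WALS estimator the abstract discusses:

```python
import numpy as np

# Illustrative sketch only: the positive-part James-Stein estimator shrinks a
# multivariate-normal mean estimate toward zero, trading a little bias for
# lower overall risk. Assumes x ~ N(theta, sigma2 * I) with dimension p >= 3.
def james_stein(x, sigma2=1.0):
    p = x.size
    factor = max(0.0, 1.0 - (p - 2) * sigma2 / float(x @ x))  # positive-part rule
    return factor * x

rng = np.random.default_rng(0)
theta = np.zeros(10)                 # true mean (zero, so shrinkage always helps)
x = theta + rng.normal(size=10)      # one noisy observation of theta
est = james_stein(x)
# With theta = 0, the shrunken estimate is never farther from the truth than x.
```

Because the shrinkage factor lies in [0, 1], the estimate has smaller norm than the raw observation; when the true mean really is near zero, this uniformly reduces estimation error.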
In small samples, and especially when true default probabilities are small, standard approaches to credit default probability estimation have certain drawbacks. Most importantly, standard estimators tend to underestimate the true default probability, which is of course an undesirable...
Persistent link: https://www.econbiz.de/10013113964
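The small-sample problem this entry points to can be seen in a minimal sketch: with few or no observed defaults, the maximum-likelihood estimate can be zero, while a Bayesian posterior mean under a Jeffreys prior stays positive. This is a generic illustration, not the specific estimator proposed in the paper:

```python
# Sketch of the small-sample default probability problem. With zero observed
# defaults the MLE k/n is exactly 0, a degenerate underestimate; the posterior
# mean under a Jeffreys Beta(1/2, 1/2) prior, (k + 1/2) / (n + 1), is strictly
# positive. Generic illustration only.
def pd_mle(defaults, n):
    return defaults / n

def pd_jeffreys(defaults, n):
    # Posterior mean of a Binomial probability under a Beta(1/2, 1/2) prior.
    return (defaults + 0.5) / (n + 1.0)

n, defaults = 50, 0              # small portfolio, no observed defaults
print(pd_mle(defaults, n))       # 0.0 -- degenerate estimate
print(pd_jeffreys(defaults, n))  # 0.5 / 51, roughly 0.0098 -- strictly positive
```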
The estimation of the holding periods of financial products has to be done as a dynamic process in which the size of the observation time interval influences the result: small intervals produce smaller average holding periods than larger ones. The approach developed in this paper offers the...
Persistent link: https://www.econbiz.de/10011890392
Probabilistic editing has been introduced to enable valid inference using established survey sampling theory in situations when some of the collected data points may have measurement errors and are therefore submitted to an editing process. To reduce the editing effort and avoid over-editing, in...
Persistent link: https://www.econbiz.de/10015207175
We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multisample analysis of moment structures, under general conditions on the...
Persistent link: https://www.econbiz.de/10014179647
We consider Particle Gibbs (PG) as a tool for Bayesian analysis of non-linear non-Gaussian state-space models. PG is a Monte Carlo (MC) approximation of the standard Gibbs procedure which uses sequential MC (SMC) importance sampling inside the Gibbs procedure to update the latent and potentially...
Persistent link: https://www.econbiz.de/10012970355
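The core of the Particle Gibbs procedure described in this entry is a conditional SMC sweep, in which a reference trajectory is pinned as one particle and survives every resampling step. Below is a minimal sketch for an assumed linear-Gaussian state-space model (x_t = phi·x_{t-1} + N(0, q), y_t = x_t + N(0, r)); a full PG sampler would alternate this sweep with parameter draws given the trajectory:

```python
import numpy as np

def conditional_smc(y, x_ref, phi, q, r, n_particles=200, rng=None):
    """One conditional SMC sweep: returns a trajectory drawn from the particle
    approximation of p(x_{0:T-1} | y_{0:T-1}, phi). The reference trajectory
    x_ref is kept as particle 0 and survives every resampling step."""
    rng = np.random.default_rng(0) if rng is None else rng
    T = len(y)
    X = np.zeros((n_particles, T))
    # t = 0: propagate from the prior N(0, q); pin the reference particle.
    X[:, 0] = rng.normal(0.0, np.sqrt(q), n_particles)
    X[0, 0] = x_ref[0]
    logw = -0.5 * (y[0] - X[:, 0]) ** 2 / r
    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling of ancestors; particle 0 keeps ancestor 0.
        anc = rng.choice(n_particles, size=n_particles, p=w)
        anc[0] = 0
        X = X[anc]
        X[:, t] = phi * X[:, t - 1] + rng.normal(0.0, np.sqrt(q), n_particles)
        X[0, t] = x_ref[t]                       # pin the reference state
        logw = -0.5 * (y[t] - X[:, t]) ** 2 / r  # bootstrap-filter weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return X[rng.choice(n_particles, p=w)]       # one sampled trajectory

# Usage: simulate data from the model, then run one sweep from a zero reference.
rng = np.random.default_rng(1)
phi, q, r, T = 0.9, 0.5, 1.0, 50
x = np.zeros(T)
x[0] = rng.normal(0.0, np.sqrt(q))
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), T)
traj = conditional_smc(y, x_ref=np.zeros(T), phi=phi, q=q, r=r, rng=rng)
```

This sketch uses plain multinomial resampling without ancestor sampling; practical PG implementations often add ancestor sampling or backward simulation to improve mixing.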
Markov chain Monte Carlo (MCMC) methods play an important role in solving high-dimensional stochastic problems characterized by computational complexity. Given their critical importance, there is a need for network and security risk management research to relate the MCMC quantitative...
Persistent link: https://www.econbiz.de/10013029835
This chapter presents a unified set of estimation methods for fitting a rich array of models describing dynamic relationships within a longitudinal data setting. The discussion surveys approaches for characterizing the micro dynamics of continuous dependent variables both over time and across...
Persistent link: https://www.econbiz.de/10014024953
A multiplier bootstrap procedure for the construction of likelihood-based confidence sets is considered for finite samples and possible model misspecification. Theoretical results justify the bootstrap consistency for small or moderate sample sizes and allow one to control the impact of the parameter...
Persistent link: https://www.econbiz.de/10010436527
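The multiplier bootstrap idea in this entry can be sketched in its simplest setting, the Gaussian mean, where the MLE has a closed form: i.i.d. standard-normal multipliers reweight the individual score contributions, and quantiles of the resulting statistic calibrate the confidence set. This is a generic illustration, not the paper's exact procedure:

```python
import numpy as np

# Generic multiplier-bootstrap sketch for a studentized mean. Each bootstrap
# draw perturbs the score contributions (x_i - xbar) with i.i.d. N(0, 1)
# multipliers e_i; the quantile of |sum_i e_i (x_i - xbar)| / (sqrt(n) * s)
# approximates the quantile of the studentized statistic.
def multiplier_bootstrap_quantile(x, n_boot=2000, alpha=0.05, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    n = x.size
    mean, s = x.mean(), x.std(ddof=1)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.normal(size=n)            # multiplier weights
        score = e * (x - mean)            # perturbed score contributions
        stats[b] = abs(score.sum()) / (np.sqrt(n) * s)
    return np.quantile(stats, 1 - alpha)

# Usage: calibrate a confidence interval for the mean of a simulated sample.
rng = np.random.default_rng(42)
x = rng.normal(loc=2.0, size=200)
q = multiplier_bootstrap_quantile(x, rng=rng)          # close to the normal 1.96
half_width = q * x.std(ddof=1) / np.sqrt(len(x))
ci = (x.mean() - half_width, x.mean() + half_width)
```

For a well-specified Gaussian sample the bootstrap quantile is close to the asymptotic normal value of about 1.96; the appeal of the multiplier scheme is that its validity can be established for finite samples and under misspecification, which is the setting of this paper.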