The classical perceptron algorithm is an elementary row-action/relaxation algorithm for solving a homogeneous linear inequality system Ax > 0. A natural condition measure associated with this algorithm is the Euclidean width τ of the cone of feasible solutions, and the iteration complexity of the...
Persistent link: https://www.econbiz.de/10005750571
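For orientation, a minimal sketch of the classical perceptron row action for a feasible homogeneous system Ax > 0 is given below; the function name, the row-selection rule, and the iteration cap are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the classical perceptron row-action/relaxation update
# for a homogeneous system Ax > 0 (rows normalized to unit length).
import numpy as np

def perceptron(A, max_iter=10_000):
    """Return x with Ax > 0 if found within max_iter row-action updates, else None."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)  # normalize each row
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        violated = np.flatnonzero(A @ x <= 0)         # rows not yet strictly satisfied
        if violated.size == 0:
            return x                                  # feasible point found
        x = x + A[violated[0]]                        # add one violated (normalized) row
    return None

# Illustrative example: a system whose feasible cone has positive width.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5)) + 2.0                # shifted so that Ax > 0 is feasible
x = perceptron(A)
print(x is not None and bool(np.all(A @ x > 0)))
```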
The goal of this paper is to develop some computational experience and test the practical relevance of the theory of condition numbers C(d) for linear optimization, as applied to problem instances that one might encounter in practice. We used the NETLIB suite of linear optimization problems as a...
Persistent link: https://www.econbiz.de/10005574545
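For context, the condition number C(d) referred to above is, in the standard Renegar sense, the ratio of the norm of the problem data to its distance to ill-posedness; the statement below is the textbook definition, not a quotation from the paper.

```latex
% Standard (Renegar-style) condition number of a problem instance with data d;
% \rho(d) is the distance from d to the set of ill-posed instances in the data norm.
C(d) \;=\; \frac{\lVert d \rVert}{\rho(d)}, \qquad
\rho(d) \;=\; \inf\bigl\{\lVert \Delta d \rVert : d + \Delta d \text{ is ill-posed}\bigr\}.
```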
This paper presents a model and an analysis of the cost-flexibility tradeoffs involved in investing in product-flexible manufacturing capacity. Flexible capacity provides a firm with the ability to respond to a wide variety of future demand outcomes, but at the expense of the increased cost of...
Persistent link: https://www.econbiz.de/10009208859
We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than n, but only s regressors are significant. The method is a modification of the lasso, called the square-root lasso. The method...
Persistent link: https://www.econbiz.de/10010613179
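As a concrete illustration of the estimator described above, here is a hedged sketch of the square-root lasso objective written with cvxpy; the penalty level `lam` and the simulated data are rough placeholders rather than the pivotal choices derived in the paper.

```python
# Sketch of the square-root lasso: root-mean-square residual plus an l1 penalty.
# Note that no estimate of the noise scale sigma appears in the objective.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.r_[np.ones(s), np.zeros(p - s)]
y = X @ beta_true + rng.standard_normal(n)

lam = 1.1 * np.sqrt(2 * np.log(p) / n)     # illustrative penalty level, not the paper's rule
b = cp.Variable(p)
objective = cp.norm(y - X @ b, 2) / np.sqrt(n) + lam * cp.norm1(b)
cp.Problem(cp.Minimize(objective)).solve()

beta_hat = b.value
print(np.flatnonzero(np.abs(beta_hat) > 1e-3))   # indices of the selected regressors
```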
Persistent link: https://www.econbiz.de/10010614094
We develop results for the use of LASSO and Post-LASSO methods to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments, p, that apply even when p is much larger than the sample size, n. We rigorously develop asymptotic...
Persistent link: https://www.econbiz.de/10008694043
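A rough sketch of the kind of first-stage construction described above: lasso selects instruments, OLS refits on the selected set (post-lasso), and the fitted value serves as the instrument in the second stage. The tuning value and data-generating process below are illustrative, not the paper's.

```python
# Lasso / post-lasso first stage for an IV model with many instruments.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n, p_z = 200, 100                                   # sample size, number of instruments
Z = rng.standard_normal((n, p_z))
pi = np.r_[np.ones(3), np.zeros(p_z - 3)]           # only 3 instruments are relevant
v = rng.standard_normal(n)
d = Z @ pi + v                                      # endogenous regressor
y = 1.0 * d + 0.8 * v + rng.standard_normal(n)      # true coefficient on d is 1.0

# First stage: lasso selects instruments, then OLS on the selected set (post-lasso).
sel = np.flatnonzero(Lasso(alpha=0.1).fit(Z, d).coef_)
d_hat = LinearRegression().fit(Z[:, sel], d).predict(Z[:, sel])

# Second stage: IV estimate using the post-lasso fitted value as the instrument.
beta_iv = (d_hat @ y) / (d_hat @ d)
print(sel, beta_iv)
```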
on the basis of their commonly known strength levels, and privately observed strength shocks
Persistent link: https://www.econbiz.de/10011104964
We propose robust methods for inference about the effect of a treatment variable on a scalar outcome in the presence of very many regressors in a model with possibly non-Gaussian and heteroscedastic disturbances. We allow for the number of regressors to be larger than the sample size. To make...
Persistent link: https://www.econbiz.de/10011268065
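One schematic version of this kind of procedure, a double-selection step followed by OLS with heteroscedasticity-robust standard errors, is sketched below; the tuning constants and data-generating process are illustrative, and the paper's exact method may differ.

```python
# Schematic double selection: lasso of the outcome on controls, lasso of the
# treatment on controls, then OLS of the outcome on treatment plus the union
# of selected controls, with heteroscedasticity-robust standard errors.
import numpy as np
from sklearn.linear_model import Lasso
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, p = 200, 300
X = rng.standard_normal((n, p))                              # high-dimensional controls
d = X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)         # treatment depends on a few controls
y = 1.0 * d + X[:, 0] - X[:, 2] + rng.standard_normal(n)     # true treatment effect is 1.0

sel_y = np.flatnonzero(Lasso(alpha=0.1).fit(X, y).coef_)     # controls predictive of the outcome
sel_d = np.flatnonzero(Lasso(alpha=0.1).fit(X, d).coef_)     # controls predictive of the treatment
union = np.union1d(sel_y, sel_d)

W = sm.add_constant(np.column_stack([d, X[:, union]]))
fit = sm.OLS(y, W).fit(cov_type="HC1")                       # robust to heteroscedasticity
print(fit.params[1], fit.bse[1])                             # estimate and s.e. for the treatment effect
```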
In this paper, we study the large‐sample properties of the posterior‐based inference in the curved exponential family under increasing dimensions. The curved structure arises from the imposition of various restrictions on the model, such as moment restrictions, and plays a fundamental role...
Persistent link: https://www.econbiz.de/10011085159
We propose a self-tuning √Lasso method that simultaneously resolves three important practical problems in high-dimensional regression analysis, namely it handles the unknown scale, heteroscedasticity, and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly...
Persistent link: https://www.econbiz.de/10010827513
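The self-tuning property rests on the score of the square-root lasso at the true coefficient being free of the unknown noise scale. The snippet below simulates a penalty level from that pivotal statistic in the homoscedastic Gaussian case, purely as an illustration; the paper's rule also covers heteroscedastic, non-Gaussian noise. This is the sort of quantity the rough placeholder `lam` in the earlier square-root lasso sketch stands in for.

```python
# Pivotal penalty level for the square-root lasso: the sup-norm of the score
# X'g / (sqrt(n) * ||g||_2) does not depend on the noise scale, so its quantile
# can be simulated without knowing sigma. The constant c and level alpha are
# illustrative choices, not the paper's.
import numpy as np

def sqrt_lasso_penalty(X, alpha=0.05, c=1.1, n_sim=1000, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    stats = np.empty(n_sim)
    for k in range(n_sim):
        g = rng.standard_normal(n)                     # any noise scale would cancel below
        stats[k] = np.max(np.abs(X.T @ g)) / (np.sqrt(n) * np.linalg.norm(g))
    return c * np.quantile(stats, 1 - alpha)

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 200))
print(sqrt_lasso_penalty(X))                           # penalty level, free of the unknown sigma
```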