Approximating and Simulating the Stochastic Growth Model: Parameterized Expectations, Neural Networks, and the Genetic Algorithm
This paper compares alternative methods for approximating and solving the stochastic growth model with parameterized expectations. We compare polynomial and neural network specifications for expectations, and we employ both genetic algorithm and gradient-descent methods for solving the alternative models of parameterized expectations. For risk aversion coefficients close to unity and full depreciation of the capital stock, many of the statistics generated by the neural network specification, in combination with the genetic algorithm and gradient-descent optimization methods, approach those generated by the exact solution. For the alternative specification, with no depreciation of capital, the neural network results approach those generated by computationally intensive methods. Our results suggest that neural network specifications and genetic algorithm solution methods should at least complement parameterized-expectations solutions based on polynomial approximation and pure gradient-descent optimization.
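To fix ideas, the combination of a neural network specification with a genetic algorithm can be sketched in miniature. The snippet below is a hedged illustration, not the paper's actual procedure: it takes the full-depreciation, log-utility case, where the exact policy function is known to be k' = αβθk^α, and uses a plain genetic algorithm (elitism, blend crossover, Gaussian mutation) to fit a single-hidden-layer tanh network to that policy on sampled states. The network size, population size, and GA operators are all illustrative assumptions; a full parameterized-expectations solution would instead iterate on the conditional expectation inside the Euler equation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model parameters (full-depreciation, log-utility case
# with a known closed-form solution)
alpha, beta = 0.36, 0.95

# Sampled states: capital k and technology shock theta
k = rng.uniform(0.05, 0.5, size=200)
theta = rng.uniform(0.8, 1.2, size=200)
X = np.column_stack([k, theta])
# Exact policy k' = alpha * beta * theta * k^alpha (target for the network)
y = alpha * beta * theta * k**alpha

H = 6  # hidden units (assumption for illustration)
n_params = 2 * H + H + H + 1  # W1 (2xH), b1 (H), w2 (H), b2 (scalar)

def predict(p, X):
    """Single-hidden-layer network with tanh activation."""
    W1 = p[:2 * H].reshape(2, H)
    b1 = p[2 * H:3 * H]
    w2 = p[3 * H:4 * H]
    b2 = p[4 * H]
    return np.tanh(X @ W1 + b1) @ w2 + b2

def loss(p):
    """Mean squared error against the exact policy."""
    return np.mean((predict(p, X) - y) ** 2)

# Plain genetic algorithm: keep an elite, breed children by blend
# crossover of two elite parents, then apply Gaussian mutation.
pop = rng.normal(0.0, 0.5, size=(60, n_params))
init_best = min(loss(p) for p in pop)
for gen in range(200):
    fits = np.array([loss(p) for p in pop])
    elite = pop[np.argsort(fits)[:10]]           # 10 fittest survive
    children = []
    while len(children) < len(pop) - len(elite):
        a, b = elite[rng.integers(10, size=2)]
        child = 0.5 * (a + b)                     # blend crossover
        child += rng.normal(0.0, 0.05, n_params)  # Gaussian mutation
        children.append(child)
    pop = np.vstack([elite, children])
final_best = min(loss(p) for p in pop)
print(f"initial best MSE: {init_best:.4f}, final best MSE: {final_best:.6f}")
```

Because the elite is carried over unchanged each generation, the best fitness is monotonically non-increasing; in the paper's setting the same machinery would be judged by how closely the simulated statistics approach those of the exact solution rather than by in-sample fit.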