Optimal simulation lengths for various algorithms computing the mean
The importance of the truncation errors committed by the computer is emphasized. A precise calculation of the bias of various mean-computation algorithms leads to the notions of optimal precision and optimal simulation length. It is proved that it is wrong to assume that precision necessarily improves as the simulation length increases. The choice of a suitable algorithm and of the simulation length therefore appears to be of prime importance.
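The phenomenon can be illustrated with a minimal sketch (not taken from the paper; the Gaussian sample, the single-precision arithmetic and the sample sizes are illustrative assumptions). It compares a naive sequential single-precision sum with compensated (Kahan) summation: the truncation bias of the naive mean grows with the simulation length n, while the compensated algorithm keeps it near machine precision.

```python
# Illustrative sketch, not the paper's algorithms: the distribution, the
# float32 arithmetic and the sample sizes are assumptions chosen only to
# exhibit the truncation bias discussed in the abstract.
import numpy as np

def naive_mean_f32(x):
    """Mean via a naive sequential float32 sum: truncation bias grows with n."""
    s = np.float32(0.0)
    for v in x:
        s = np.float32(s + v)          # each addition is rounded to float32
    return float(s) / len(x)

def kahan_mean_f32(x):
    """Mean via compensated (Kahan) float32 summation: bias stays near eps."""
    s = np.float32(0.0)
    c = np.float32(0.0)                # compensation for lost low-order bits
    for v in x:
        y = np.float32(v - c)
        t = np.float32(s + y)
        c = np.float32((t - s) - y)
        s = t
    return float(s) / len(x)

rng = np.random.default_rng(0)
for n in (10**3, 10**4, 10**5, 10**6):
    x = rng.normal(loc=1.0, scale=0.1, size=n).astype(np.float32)
    ref = x.astype(np.float64).mean()  # double-precision reference mean
    print(f"n={n:>7}  naive bias={abs(naive_mean_f32(x) - ref):.2e}  "
          f"kahan bias={abs(kahan_mean_f32(x) - ref):.2e}")
```

For the naive sum, the statistical error of the estimated mean decreases with n while the truncation bias grows, so the total error is minimized at some finite simulation length; running longer then degrades rather than improves precision, which is the point made in the abstract.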