Learning and retrieval in attractor neural networks with noise
A recent study of noiseless learning and retrieval in attractor neural networks above saturation, by Griniasty and Gutfreund, is extended to account for imperfect learning by introducing a learning temperature T = β⁻¹. Violations of the constraint imposed on the local stabilities are penalized by various cost functions. The distribution of local stabilities and the fraction of errors incurred during the learning stage are analysed. The retrieval dynamics of the sparsely connected network is studied, showing that for finite T a high retrieval overlap persists in reduced retrieval regions.
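To make the central quantities concrete, the following is a minimal numerical sketch (not the paper's method) of local stabilities in a network of N binary neurons storing P random patterns. It assumes simple Hebbian couplings and a stability threshold κ, both illustrative choices: the stability Δᵢᵘ = ξᵢᵘ hᵢᵘ / |Jᵢ| measures how robustly site i aligns with pattern μ, and the fraction of sites with Δᵢᵘ < κ plays the role of the error fraction during learning.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 20                 # neurons and stored patterns (loading alpha = P/N = 0.1)
xi = rng.choice([-1.0, 1.0], size=(P, N))   # random binary patterns xi_i^mu

# Illustrative Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Local fields h_i^mu = sum_j J_ij xi_j^mu and normalized local stabilities
h = xi @ J.T                                # shape (P, N)
row_norms = np.linalg.norm(J, axis=1)       # |J_i| for each site i
Delta = xi * h / row_norms                  # Delta_i^mu = xi_i^mu h_i^mu / |J_i|

# Fraction of constraint violations Delta < kappa (the "errors")
kappa = 0.0
error_fraction = np.mean(Delta < kappa)
print(f"mean stability: {Delta.mean():.3f}, error fraction: {error_fraction:.4f}")
```

At this low loading the Hebbian stabilities are mostly positive, so the error fraction is small; the paper's finite-T learning instead samples couplings that trade off such violations through a cost function rather than fixing J by the Hebb rule.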