Statistical mechanics of neural networks: what are the differences between wide and narrow basins?
We consider training noise in neural networks as a means of tuning the structure of retrieval basins, and study how learning and retrieval properties depend on it. The stability of the replica symmetric solution and the correlation in weight space indicate that neural networks can be roughly classified into Hebbian-like and MSN-like (where MSN denotes the maximally stable network). Re-entrant retrieval, noise robustness, selectivity, damage spreading and activity distribution all illustrate the differences in retrieval behaviour arising from the different basin structures.
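As a rough illustration of the two learning regimes the abstract contrasts, the following is a minimal sketch, not the paper's actual formulation: Hebbian outer-product storage versus a perceptron-type rule driven toward large stabilities, in the spirit of the maximally stable network, with training noise modeled as random bit flips in the presented patterns. All names and parameter choices (N, P, the noise level, the margin kappa) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 10                          # neurons, stored patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning: one-shot outer-product rule.
W_hebb = patterns.T @ patterns / N
np.fill_diagonal(W_hebb, 0.0)

def train_msn_like(patterns, noise=0.1, kappa=1.0, epochs=200, lr=0.05):
    """Perceptron-type training toward large stabilities (MSN-like).

    Training noise is modeled here as flipping each bit of a presented
    pattern with probability `noise` -- an assumption for illustration.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        for xi in patterns:
            x = xi * np.where(rng.random(N) < noise, -1, 1)  # noisy example
            h = W @ x                                        # local fields
            unstable = xi * h < kappa                        # below margin
            # update only the weight rows of neurons with small stability
            W[unstable] += lr * np.outer(xi[unstable], x) / N
        np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, cue, steps=20):
    """Iterate deterministic parallel dynamics s -> sign(W s)."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Start from a corrupted cue and compare the two networks' basins.
xi = patterns[0]
cue = xi * np.where(rng.random(N) < 0.2, -1, 1)  # 20% of bits flipped
for name, W in [("Hebbian", W_hebb), ("MSN-like", train_msn_like(patterns))]:
    m = retrieve(W, cue) @ xi / N
    print(f"{name}: overlap with stored pattern = {m:+.2f}")
```

The overlap m = (1/N) Σ_i s_i ξ_i printed at the end is the standard retrieval order parameter; values near 1 indicate the cue fell inside the basin of the stored pattern.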
Year of publication: 1992
Authors: Wong, K.Y.M.; Sherrington, D.
Published in: Physica A: Statistical Mechanics and its Applications. Elsevier, ISSN 0378-4371. Vol. 185 (1992), No. 1, p. 453-460
Publisher: Elsevier
Similar items by person
- Adaptive optimization in neural networks. Wong, K.Y.M. (1992)
- Novel phase diagrams of sequential learning in neural networks. Wong, K.Y.M. (1992)
- Freezing transitions in neural networks. Wong, K.Y.M. (1994)