How OWE architectures encode contextual effects in artificial neural networks
Artificial neural networks (ANNs) are widely used for classification tasks in which both discriminant cues and contextual parameters are presented as inputs. When the input space is too large for robust learning within a limited time, a classical solution is to design a set of ANNs, one per context domain. We have proposed a new learning algorithm, the lateral contribution learning algorithm (LCLA), based on the backpropagation learning algorithm, which supports such a solution with reduced learning time and better performance thanks to lateral influences between networks. This attractive but heavy solution has been improved with the orthogonal weight estimator (OWE) technique, an original architectural technique which, under light constraints, merges the set of ANNs into a single ANN whose weights are dynamically estimated, for each example, by other ANNs fed with the context. This architecture yields a rich and interesting interpretation of the weight landscape. We illustrate this interpretation with two examples: the estimation of a mathematical function and the modelling of a process used in neurocontrol.
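The core OWE idea described in the abstract can be sketched as follows: a main network processes the discriminant cues, but its weights are not fixed parameters; instead, they are produced on the fly from the context inputs by estimator networks. The sketch below is a minimal forward-pass illustration of that idea, not the authors' implementation; all names, shapes, and the use of a single linear map in place of one estimator ANN per weight are assumptions made for brevity.

```python
import numpy as np

# Minimal sketch (assumption: not the paper's code) of an OWE-style
# architecture: the main network's weights are estimated, per example,
# from the context variables. One estimator ANN per weight in the paper
# is collapsed here into a single linear map for compactness.

rng = np.random.default_rng(0)

n_cue, n_hidden, n_ctx = 3, 4, 2  # cue inputs, hidden units, context inputs

# Estimator parameters: map a context vector to the main network's weights.
E1 = rng.normal(scale=0.1, size=(n_hidden * n_cue, n_ctx))
E2 = rng.normal(scale=0.1, size=(n_hidden, n_ctx))

def owe_forward(cue, ctx):
    """Forward pass: the context dynamically sets the main net's weights."""
    W1 = (E1 @ ctx).reshape(n_hidden, n_cue)  # hidden weights, context-dependent
    W2 = (E2 @ ctx).reshape(1, n_hidden)      # output weights, context-dependent
    h = np.tanh(W1 @ cue)                     # main network hidden activation
    return W2 @ h                             # scalar output (shape (1,))

y = owe_forward(np.ones(n_cue), np.array([1.0, -1.0]))
```

Because the weight landscape is now a function of the context, the same cue vector is processed by effectively different networks in different contexts, which is what gives the single merged ANN the expressive power of the original set of context-specific ANNs.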
Year of publication: 1996
Authors: Pican, Nicolas; Alexandre, Frédéric
Published in: Mathematics and Computers in Simulation (MATCOM). - Elsevier, ISSN 0378-4754. - Vol. 41.1996, 1, p. 63-74
Publisher: Elsevier
Similar items by person
- Manager un projet : [s'organiser en amont, planifier les actions, faire participer l'équipe] / Alexandre, Frédéric, (2011)