Reducing the Number of Neurons in Radial Basis Function Networks with Dynamic Decay Adjustment
Classification is a common task for supervised neural networks. A specific radial basis function network for classification is the so-called RBF network with dynamic decay adjustment (RBFN-DDA). This network offers fast training and good classification performance. RBFN-DDA is a dynamically growing network, i.e., neurons are inserted during training. A drawback of RBFN-DDA is its greedy insertion behavior: too many superfluous neurons are inserted for noisy or overlapping data and for outliers. We propose an online technique to reduce the number of neurons during training. We achieve this by deleting neurons after each training epoch. Using our improved algorithm, we can reduce the number of neurons noticeably (up to 88.2% fewer neurons) and obtain a network of lower complexity compared to the original RBFN-DDA.
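To make the epoch-wise deletion idea concrete, the following is a minimal sketch in Python, not the paper's actual algorithm. It assumes the usual RBF-DDA setting in which each neuron (prototype) carries a Gaussian center, a width, a class label, and a coverage weight accumulated during training; the names `Prototype`, `prune_epoch`, and the threshold `min_weight` are illustrative, and the deletion criterion (dropping neurons with low coverage, as noise or outliers would produce) is an assumption, since the abstract does not specify it.

```python
import numpy as np

# Hypothetical sketch of epoch-wise pruning for an RBF-DDA-style network.
# Each prototype (neuron) stores a center, a width sigma, a class label,
# and a weight counting the training patterns it covered in the epoch.

class Prototype:
    def __init__(self, center, sigma, label):
        self.center = np.asarray(center, dtype=float)
        self.sigma = float(sigma)
        self.label = label
        self.weight = 0.0  # coverage accumulated during the current epoch

    def activation(self, x):
        # Gaussian radial basis function.
        d2 = np.sum((np.asarray(x, dtype=float) - self.center) ** 2)
        return np.exp(-d2 / (self.sigma ** 2))

def prune_epoch(prototypes, min_weight=1.0):
    """Delete neurons whose coverage weight stayed below min_weight
    after one training epoch; such neurons typically stem from noise,
    overlapping data, or outliers and contribute little to the
    classification decision."""
    return [p for p in prototypes if p.weight >= min_weight]
```

After each epoch one would call, e.g., `prototypes = prune_epoch(prototypes, min_weight=2.0)` and reset the coverage weights before the next pass; the concrete threshold and weight-update rule here are assumptions for illustration only.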