Highly resistant gradient descent algorithm for computing intrinsic mean shape on similarity shape spaces
Among many optimization algorithms, the gradient descent algorithm (GDA) is a simple tool for deriving an optimal quantity when dealing with an optimization problem in a linear space. Apart from the initial value, the step size has a great impact on the convergence rate of this algorithm. Its effect on the geometric structure of the consecutive configurations is even more crucial when one works with an optimization problem in statistical shape analysis. In other words, if the step size of the GDA is not properly tuned, the geometry might not be preserved while the algorithm moves forward to reach an optimal mean shape. To improve the performance of the GDA, we introduce a dynamic step size and a new criterion, both to check the geometry in each step of the algorithm and to accelerate the convergence rate. These lead to a new robust algorithm for deriving the intrinsic mean on the shape space. We compare the performance of the proposed procedure with that of the usual GDA using real shape data together with simulation studies. Copyright The Author(s) 2015
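The abstract's core idea, gradient descent for an intrinsic (Fréchet) mean on a curved space, can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it is a standard GDA for the intrinsic mean on the unit sphere (a simple stand-in for a shape space), using the sphere's exponential and logarithm maps. The function names and the fixed step size are illustrative assumptions; the paper's contribution is precisely a dynamic step size and a geometry-checking criterion on top of this basic scheme.

```python
import numpy as np

def log_map(p, q):
    # Log map on the unit sphere: tangent vector at p pointing toward q,
    # with length equal to the geodesic distance between p and q.
    v = q - np.dot(p, q) * p
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return np.zeros_like(p)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return (theta / nv) * v

def exp_map(p, v):
    # Exp map on the unit sphere: follow the geodesic from p along v.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def intrinsic_mean(points, step=1.0, tol=1e-9, max_iter=100):
    # Basic GDA for the Frechet mean: the (negative) gradient of the sum of
    # squared geodesic distances is the average of the log maps, so each
    # iteration moves the current estimate along that average direction.
    mu = points[0]
    for _ in range(max_iter):
        grad = np.mean([log_map(mu, q) for q in points], axis=0)
        if np.linalg.norm(grad) < tol:
            break
        mu = exp_map(mu, step * grad)
    return mu
```

With a fixed step size this iteration can overshoot or leave the intended geometry in less forgiving spaces, which is the failure mode the paper's dynamic step size and per-step geometry check are designed to avoid.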
Year of publication: 2015
Authors: Fotouhi, H.; Golalizadeh, M.
Published in: Statistical Papers. - Springer. - Vol. 56.2015, 2, p. 391-410
Publisher: Springer
Subject: Shape space | Non-Euclidean statistics | Intrinsic mean shape | Robust gradient descent algorithm | Step size
Saved in: Online Resource
Similar items by subject
- Convergence of the majorization method for multidimensional scaling. Leeuw, Jan (1988)
- The Geometry of Shape Space: Application to Influenza. Lapedes, Alan (2000)
- Deriving Shape Space Parameters from Immunological Data for a Model of Cross-Reactive Memory. Smith, Derek J. (1997)
Similar items by person