Purpose – The purpose of this paper is to examine the accounting education systems in three countries – Australia, Japan and Sri Lanka – to inform the development and testing (by application) of a Global Model of Accounting Education. Design/methodology/approach – An action research...
Persistent link: https://www.econbiz.de/10010941821
Lipschitz continuity of the gradient mapping of a continuously differentiable function plays a crucial role in designing various optimization algorithms. However, many functions arising in practical applications such as low rank matrix factorization or deep neural network problems do not have a...
Persistent link: https://www.econbiz.de/10014502010
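The snippet above concerns objectives whose gradient is not globally Lipschitz. A minimal sketch of what that means in practice: for f(x) = ||x||^4 the local Lipschitz constant of the gradient grows with the size of the region, which we can see by sampling. The test function and sampling scheme are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# f(x) = ||x||^4 has gradient 4*||x||^2 * x, which is NOT globally Lipschitz:
# the local Lipschitz constant grows with the radius of the region considered.
# We estimate it empirically by sampling pairs of points (illustrative only).

def grad(x):
    return 4.0 * np.dot(x, x) * x

def local_lipschitz_estimate(radius, n_pairs=2000, dim=3, seed=0):
    rng = np.random.default_rng(seed)
    best = 0.0
    for _ in range(n_pairs):
        x = rng.uniform(-radius, radius, dim)
        y = rng.uniform(-radius, radius, dim)
        denom = np.linalg.norm(x - y)
        if denom > 1e-12:
            best = max(best, np.linalg.norm(grad(x) - grad(y)) / denom)
    return best

small = local_lipschitz_estimate(radius=1.0)
large = local_lipschitz_estimate(radius=10.0)
print(small, large)  # the estimate grows sharply with the radius
```

Because no single constant bounds the ratio over all of R^n, step-size rules that assume a global Lipschitz constant break down, which motivates the relaxed smoothness conditions such papers study.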
Conjugate gradient methods are efficient for minimizing differentiable objective functions in high-dimensional spaces. Recently, Dai and Yuan introduced a three-parameter family of nonlinear conjugate gradient methods and showed their convergence. However, line search strategies usually bring...
Persistent link: https://www.econbiz.de/10005050659
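To make the family concrete, here is a minimal sketch of one of its members, the Dai–Yuan conjugate gradient method, applied to a strictly convex quadratic with an exact line search. The test problem (matrix A, vector b) is an illustrative assumption.

```python
import numpy as np

# Dai-Yuan nonlinear conjugate gradient on f(x) = 0.5*x'Ax - b'x,
# using the exact line search available in the quadratic case.

def dai_yuan_cg(A, b, x0, iters=50):
    x = x0.copy()
    g = A @ x - b                # gradient of the quadratic
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break
        alpha = -(g @ d) / (d @ A @ d)           # exact line search
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (d @ (g_new - g))  # Dai-Yuan formula
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = dai_yuan_cg(A, b, np.zeros(2))
print(np.linalg.norm(A @ x - b))  # near zero
```

On a quadratic with exact line searches the Dai–Yuan update coincides with classical linear CG; the differences among family members only show up with inexact line searches on general nonlinear objectives, which is precisely where the convergence analysis the snippet mentions becomes delicate.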
The shortest-residual family of conjugate gradient methods was first proposed by Hestenes and was later studied by Pytlak and by Dai and Yuan. Recently, a no-line-search scheme for conjugate gradient methods was given by Sun and Zhang, and by Chen and Sun. In this paper, we show the global convergence of...
Persistent link: https://www.econbiz.de/10005080681
The mutation operator is one of the mechanisms of evolutionary algorithms (EAs): it provides diversity in the search and helps to explore undiscovered regions of the search space. Quantum-behaved particle swarm optimization (QPSO), which is inspired by the fundamental theory of the PSO algorithm and quantum...
Persistent link: https://www.econbiz.de/10004980434
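As a minimal illustration of the diversity mechanism the snippet describes, here is a generic Gaussian mutation operator of the kind EAs apply to real-valued individuals. The mutation rate and scale are illustrative assumptions, not parameters from the QPSO paper.

```python
import random

# Gaussian mutation: each gene is perturbed with probability `rate`
# by zero-mean noise of standard deviation `scale`, injecting diversity
# without discarding the rest of the individual. (Illustrative sketch.)

def mutate(individual, rate=0.2, scale=0.5, rng=None):
    rng = rng or random.Random(42)
    return [x + rng.gauss(0.0, scale) if rng.random() < rate else x
            for x in individual]

child = mutate([1.0, 2.0, 3.0, 4.0])
print(child)  # most genes unchanged, a few perturbed
```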
A bundle method for minimizing the difference of convex (DC) and possibly nonsmooth functions is developed. The method may be viewed as an inexact version of the DC algorithm, where each subproblem is solved only approximately by a bundle method. We always terminate the bundle method after the...
Persistent link: https://www.econbiz.de/10015209741
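The bundle method above builds on the classical DC algorithm (DCA), in which each iteration linearizes the concave part and solves a convex subproblem. A minimal sketch for f = g - h with g(x) = x^2 and h(x) = 2|x|, where the subproblem argmin_x g(x) - s*x has the closed form x = s/2; the example and its closed-form subproblem are assumptions, whereas the paper solves each subproblem only approximately by a bundle method.

```python
# DC algorithm (DCA) on f(x) = x^2 - 2|x|, whose local minimizers are x = +/-1
# with value f(+/-1) = -1. Each step linearizes h at the current iterate and
# minimizes the resulting convex model exactly. (Illustrative sketch.)

def dca(x0, iters=20):
    x = x0
    for _ in range(iters):
        s = 2.0 if x >= 0 else -2.0   # a subgradient of h(x) = 2|x|
        x = s / 2.0                   # exact minimizer of g(x) - s*x
    return x

x = dca(0.3)
print(x, x * x - 2 * abs(x))  # converges to 1.0 with f(x) = -1.0
```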
Gradient-based methods have been highly successful for solving a variety of both unconstrained and constrained nonlinear optimization problems. In real-world applications, such as optimal control or machine learning, the necessary function and derivative information may be corrupted by noise,...
Persistent link: https://www.econbiz.de/10015361665
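A minimal sketch of the noisy setting the snippet describes: gradient descent with a gradient oracle corrupted by additive noise. With a fixed step size the iterates reach a noise-dominated neighborhood of the minimizer rather than the exact solution. The test function, noise model, and step size are assumptions for illustration.

```python
import numpy as np

# Gradient descent on f(x) = 0.5*||x||^2 with a noisy gradient oracle:
# the true gradient is x, corrupted here by Gaussian noise of scale sigma.

rng = np.random.default_rng(1)

def noisy_grad(x, sigma=0.1):
    return x + sigma * rng.standard_normal(x.shape)

x = np.full(5, 10.0)          # start far from the minimizer at the origin
for _ in range(200):
    x = x - 0.1 * noisy_grad(x)

print(np.linalg.norm(x))  # small, but limited by the noise floor
```

The iterates contract geometrically toward the origin but stagnate at a residual error proportional to the noise level and step size, which is the regime the convergence analyses in such papers quantify.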