New Scaled Conjugate Gradient Algorithm for Training Artificial Neural Networks Based on Pure Conjugacy Condition

Abstract

Conjugate gradient methods are excellent neural network training methods, characterized by their simplicity, efficiency, and very low memory requirements. In this paper, we propose a new scaled conjugate gradient training algorithm for neural networks that guarantees the descent property under the standard Wolfe conditions. Encouraging numerical experiments verify that the proposed algorithm achieves fast and stable convergence.
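To make the setting concrete, the following is a minimal sketch of a generic scaled conjugate gradient iteration with a standard Wolfe line search. It is not the paper's proposed algorithm: the scaling factor (a Barzilai–Borwein spectral estimate) and the beta formula (Hestenes–Stiefel style) are illustrative stand-ins, and all function names are my own.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step length satisfying the standard Wolfe conditions."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, dg0 = f(x), grad(x) @ d          # dg0 < 0 along a descent direction
    for _ in range(max_iter):
        x_new = x + alpha * d
        if f(x_new) > fx + c1 * alpha * dg0:   # sufficient-decrease condition violated
            hi = alpha
        elif grad(x_new) @ d < c2 * dg0:       # curvature condition violated
            lo = alpha
        else:
            return alpha
        alpha = 2 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic scaled CG: d_{k+1} = -theta_k * g_{k+1} + beta_k * d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        theta = (s @ s) / (s @ y)        # spectral scaling (assumed choice)
        beta = (g_new @ y) / (d @ y)     # Hestenes-Stiefel beta (assumed choice)
        d = -theta * g_new + beta * d
        if g_new @ d >= 0:               # safeguard: restart with scaled steepest descent
            d = -theta * g_new
        x, g = x_new, g_new
    return x
```

On a simple ill-conditioned quadratic such as f(x) = 0.5(x_1^2 + 10 x_2^2), this iteration converges to the origin in a handful of steps, illustrating why CG-type methods suit large training problems: only a few vectors need to be stored.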