
IEEE Trans Neural Netw Learn Syst. 2017 Sep;28(9):2022-2034. doi: 10.1109/TNNLS.2016.2572310. Epub 2016 Jun 08.

An Approach to Stable Gradient-Descent Adaptation of Higher Order Neural Units.


Ivo Bukovsky, Noriyasu Homma

PMID: 27295693 DOI: 10.1109/TNNLS.2016.2572310

Abstract

Stability evaluation of the weight-update system of higher order neural units (HONUs) with polynomial aggregation of neural inputs (also known as classes of polynomial neural networks) is introduced for gradient-descent adaptation of both feedforward and recurrent HONUs. The core of the approach is the spectral radius of the weight-update system, which allows stability to be monitored and maintained at every individual adaptation step. Assuring stability of the weight-update system at every single adaptation step naturally yields adaptation stability of the whole neural architecture as it adapts to the target data. The approach also highlights the fact that weight optimization of an HONU is a linear problem, so it can be extended to any neural architecture that is linear in its adaptable parameters.
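
As a rough illustration of the idea (not the authors' code), the following Python/NumPy sketch adapts a quadratic neural unit (QNU, a second-order HONU) by sample-by-sample gradient descent and checks the spectral radius of the weight-update matrix at every step; the function names, the halving of the learning rate, and the toy target are assumptions made here for brevity.

import numpy as np

def colx_quadratic(x):
    # Polynomial (2nd-order) aggregation of inputs: bias, linear, and quadratic terms.
    xa = np.concatenate(([1.0], x))
    return np.array([xa[i] * xa[j] for i in range(len(xa)) for j in range(i, len(xa))])

def stable_gd_step(w, x, d, mu):
    # One sample-by-sample gradient-descent update with per-step stability monitoring.
    cx = colx_quadratic(x)
    # Weight-update system:  w_next = (I - mu * cx cx^T) w + mu * d * cx
    M = np.eye(len(cx)) - mu * np.outer(cx, cx)
    # Spectral-radius check; halve the learning rate until the update map is stable.
    while np.max(np.abs(np.linalg.eigvals(M))) > 1.0 + 1e-9:
        mu *= 0.5
        M = np.eye(len(cx)) - mu * np.outer(cx, cx)
    e = d - w @ cx        # prediction error of the HONU, which is linear in w
    return w + mu * e * cx, mu

# Toy usage: adapt a QNU to a static quadratic target.
rng = np.random.default_rng(0)
w = np.zeros(len(colx_quadratic(np.zeros(2))))
mu = 1.0
for _ in range(200):
    x = rng.standard_normal(2)
    d = 1.0 + 2.0 * x[0] * x[1] - x[1] ** 2
    w, mu = stable_gd_step(w, x, d, mu)

Because the HONU output is linear in its weights, the update matrix I - mu * cx cx^T fully determines per-step stability, which is why the same monitoring applies to any architecture linear in its adaptable parameters.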
