Publication Details

Neural Networks

Contribution To Journal


Recurrent neural networks (RNNs) with linearized dynamics have shown great promise in solving continuous-valued optimization problems subject to bound constraints. Building on this progress, a novel method of constrained hierarchical multi-scale optimization is developed that applies to a wide range of optimization problems and signal decomposition tasks. Central to the underlying concept is the definition of adiabatic layering. The analytic justification of this model can be regarded as a natural development of mean-field theory. What emerges is an alternative hierarchical optimization method that promises to improve upon existing hierarchical schemes by combining the accuracy of global optimization with the compact representation of hierarchical optimization. Whereas conventional hierarchical optimization techniques tend to average over fine-scale detail when applied to bound-constrained problems, such behaviour is avoided by the modified dynamics of the proposed method. Applied to the signal decomposition problem of RBF approximation, the behaviour of the adiabatic layering model is shown to correspond closely with theoretical expectations.
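To make the starting point of the abstract concrete, the following is a minimal sketch (not the paper's adiabatic layering model) of the kind of RNN dynamics it builds on: continuous-time gradient dynamics for a bound-constrained quadratic objective, discretized by forward Euler with a projection onto the box constraints. All names, the objective, and the step sizes are illustrative assumptions.

```python
import numpy as np

def rnn_bound_constrained_qp(Q, b, lower, upper, dt=0.01, steps=5000):
    """Projected-gradient RNN dynamics for min 0.5*x'Qx - b'x, s.t. lower <= x <= upper.

    Each Euler step follows dx/dt = -(Qx - b), then clips the state back
    into the feasible box -- a generic illustration, not the paper's method.
    """
    x = np.clip(np.zeros_like(b), lower, upper)  # start at a feasible point
    for _ in range(steps):
        grad = Q @ x - b                          # gradient of the quadratic
        x = np.clip(x - dt * grad, lower, upper)  # Euler step + box projection
    return x

# Usage: minimize 0.5*x'Qx - b'x over the box [0, 1]^2.
# The unconstrained minimizer is [0.5, 1.5]; the bound x2 <= 1 is active.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 3.0])
x_star = rnn_bound_constrained_qp(Q, b, lower=0.0, upper=1.0)
```

The fixed point of these clipped dynamics is the constrained minimizer, here `[0.5, 1.0]`, illustrating how bound constraints are absorbed directly into the network's state update rather than handled by a separate penalty term.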
