Recurrent neural networks (RNNs) with linearized dynamics have shown great promise in solving continuous-valued optimization problems subject to bound constraints. Building on this progress, a novel method of hierarchical multi-scale optimization is developed that applies to a wide range of optimization problems and signal decomposition tasks. Central to the underlying concept is the definition of adiabatic layering. The analytic justification of this model can be regarded as a natural development of mean-field theory. What emerges is an alternative hierarchical optimization method that promises to improve on existing hierarchical schemes by combining the accuracy of global optimization with the compact representation of a hierarchical approach. Whereas conventional hierarchical optimization techniques tend to average over fine-scale details, the modified dynamics of the proposed method avoid this behavior. Applied to the signal decomposition problem of radial basis function (RBF) approximation, the behavior of the adiabatic-layering model is shown to correspond closely with theoretical expectations.
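As a point of reference for the bound-constrained setting mentioned above, the following is a minimal sketch of Hopfield-style continuous RNN dynamics, which are known to descend a quadratic energy over a box-constrained domain. The coupling matrix `W`, the sigmoid gain, and the toy problem are illustrative assumptions; this sketch does not reproduce the paper's specific linearized dynamics or the adiabatic-layering schedule.

```python
import numpy as np

def hopfield_descent(W, b, steps=5000, dt=0.01, gain=4.0, seed=0):
    """Continuous Hopfield-style RNN minimizing the quadratic energy
    E(v) = -1/2 v^T W v - b^T v over the box v in (0, 1)^n.
    Generic sketch only; the paper's adiabatic-layering dynamics differ."""
    rng = np.random.default_rng(seed)
    u = 0.1 * rng.standard_normal(len(b))      # internal (pre-activation) states
    for _ in range(steps):
        v = 1.0 / (1.0 + np.exp(-gain * u))    # sigmoid keeps v inside (0, 1)
        u += dt * (-u + W @ v + b)             # leaky integration dynamics
    return 1.0 / (1.0 + np.exp(-gain * u))

# Toy bound-constrained problem: symmetric coupling with a competitive
# off-diagonal term, so the energy minimum sits at the corner v = (1, 0).
W = np.array([[0.0, -2.0], [-2.0, 0.0]])
b = np.array([1.0, 0.5])
print(hopfield_descent(W, b))                  # converges near (1, 0)
```

The sigmoid nonlinearity enforces the bound constraints implicitly, which is one reason this class of dynamics is attractive for continuous-valued constrained optimization.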