Feature selection is a task of great importance. Many feature selection methods have been proposed; they can generally be divided into two groups based on their dependence on the learning algorithm or classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. The performances of FD-NEAT, FS-NEAT and traditional NEAT are compared on several mathematical problems and in a challenging race car simulator domain (RARS). On the whole, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.
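The contrast the abstract draws between the two methods can be sketched in a few lines of Python. This is a toy illustration only, not the authors' implementation: the genome representation (a map from input-feature index to connection weight), the drop probability, and the function names are all assumptions.

```python
import random

# Toy sketch of the two opposite search directions described above.
# A "genome" here is just a map: input-feature index -> connection weight.

def fd_neat_init(n_inputs, rng=random):
    """FD-NEAT start: every input is connected (fully connected inputs)."""
    return {i: rng.uniform(-1.0, 1.0) for i in range(n_inputs)}

def fs_neat_init(n_inputs, rng=random):
    """FS-NEAT start: only a single, randomly chosen input is connected."""
    return {rng.randrange(n_inputs): rng.uniform(-1.0, 1.0)}

def mutate_drop_input(genome, p_drop=0.1, rng=random):
    """FD-NEAT-style mutation: each input connection may be deselected,
    so irrelevant or redundant features can fall away over generations."""
    kept = {i: w for i, w in genome.items() if rng.random() >= p_drop}
    return kept if kept else dict(genome)  # never leave a network with no inputs
```

Selection pressure would then do the rest: under a fitness function that rewards task performance, networks that shed noisy inputs while keeping predictive ones tend to survive, so feature deselection and topology evolution proceed together.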
Tan, MYL, Hartley, M, Bister, M & Deklerck, R 2009, 'Automated feature selection in neuroevolution', Evolutionary Intelligence, vol. 1, pp. 271–292.
Tan, M. Y. L., Hartley, M., Bister, M., & Deklerck, R. (2009). Automated feature selection in neuroevolution. Evolutionary Intelligence, 1, 271–292.
@article{6bbe32ce7a6948d699d95a3e372f0785,
title = "Automated Feature Selection in Neuroevolution",
abstract = "Feature selection is a task of great importance. Many feature selection methods have been proposed; they can generally be divided into two groups based on their dependence on the learning algorithm or classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. The performances of FD-NEAT, FS-NEAT and traditional NEAT are compared on several mathematical problems and in a challenging race car simulator domain (RARS). On the whole, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.",
keywords = "Neural networks, Genetic algorithms, Evolution, Learning",
author = "Tan, {Maxine Yen Ling} and Michael Hartley and Michel Bister and Rudi Deklerck",
year = "2009",
month = feb,
language = "English",
volume = "1",
pages = "271--292",
journal = "Evolutionary Intelligence",
issn = "1864-5917",
publisher = "Springer-Verlag",
}