Neuro Evolution Revisited: Blending Neural Architecture Search and Feature Selection ■
Neuroevolution is the field of learning neural network architectures with genetic
algorithms: the quest for the optimal network topology is treated as a search process driven by a
genetic algorithm. A decade ago, the conceptual methods were established and impressive results were
achieved. Our innovation at the time was to add feature selection to the search process, so that
neuroevolution learns the relevant input features and the network topology simultaneously. Optimising
them jointly leads to better results than optimising them sequentially.
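The idea of evolving the feature set and the weights in one search can be illustrated with a toy genetic algorithm. This is a minimal sketch, not FS-NEAT itself (which also evolves hidden-node topology via NEAT's speciation and historical markings): each genome carries a feature mask plus weights, and mutation can either toggle a feature (a structural change, analogous to FS-NEAT adding an input connection) or perturb a weight. All names and the toy dataset are illustrative assumptions.

```python
import random

random.seed(0)

# Toy task: 6 candidate input features, but the target depends only on
# features 0 and 3. A good search should learn to ignore the rest.
def make_data(n=200):
    data = []
    for _ in range(n):
        x = [random.uniform(-1, 1) for _ in range(6)]
        y = 2.0 * x[0] - 1.5 * x[3]
        data.append((x, y))
    return data

DATA = make_data()

def new_genome():
    # Start sparse, in the spirit of FS-NEAT's minimal initial connectivity.
    return {"mask": [random.random() < 0.3 for _ in range(6)],
            "weights": [random.uniform(-2, 2) for _ in range(6)]}

def fitness(g):
    # Negative mean squared error over the toy dataset; higher is better.
    err = 0.0
    for x, y in DATA:
        pred = sum(w * xi for w, xi, m in zip(g["weights"], x, g["mask"]) if m)
        err += (pred - y) ** 2
    return -err / len(DATA)

def mutate(g):
    child = {"mask": g["mask"][:], "weights": g["weights"][:]}
    i = random.randrange(6)
    if random.random() < 0.2:
        child["mask"][i] = not child["mask"][i]   # structural: toggle a feature
    else:
        child["weights"][i] += random.gauss(0, 0.3)  # parametric: nudge a weight
    return child

# Truncation selection: keep the 10 best, refill by mutating them.
pop = [new_genome() for _ in range(50)]
for generation in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(40)]

best = max(pop, key=fitness)
print("selected features:", [i for i, m in enumerate(best["mask"]) if m])
```

The point of the sketch is that feature selection and parameter fitting happen in the same evolutionary loop, rather than as a preprocessing step followed by training.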
However, these methods are very compute intensive. At the time, proper experiments on large problems
were not feasible because the compute power was lacking. Since then, neuroevolution has been
implemented on GPUs and the available compute power has exploded. It is time to revisit the old
problems, run experiments that once took a month in less than a day, and finally perform the proper tests.
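The GPU speed-up comes largely from evaluating the whole population in batched tensor operations instead of looping over genomes one by one. The sketch below shows the shape of that computation with NumPy on the CPU, assuming fixed-topology networks for illustration; real NEAT genomes have varying topologies, which is exactly what makes efficient batching non-trivial. All sizes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# One hidden-layer network per genome, stacked along a population axis.
POP, N_IN, N_HID, N_SAMPLES = 64, 8, 16, 512

W1 = rng.normal(size=(POP, N_IN, N_HID))   # input-to-hidden weights, per genome
W2 = rng.normal(size=(POP, N_HID, 1))      # hidden-to-output weights, per genome
X = rng.normal(size=(N_SAMPLES, N_IN))     # shared evaluation batch
y = X[:, 0:1] - X[:, 3:4]                  # toy regression target

# Evaluate every genome on every sample in two batched matrix products.
H = np.tanh(np.einsum("si,pih->psh", X, W1))      # (POP, N_SAMPLES, N_HID)
pred = np.einsum("psh,pho->pso", H, W2)[..., 0]   # (POP, N_SAMPLES)
fitness = -np.mean((pred - y.T) ** 2, axis=1)     # one score per genome
print("best genome:", int(np.argmax(fitness)))
```

On a GPU framework the same einsum-style batching runs on device memory, which is why population-level evaluation that took a month a decade ago now fits in a day.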
The goals of this thesis are:
- To implement FS-NEAT and FD-NEAT in a parallel, GPU-optimised version of NEAT.
- To evaluate the benchmark problems of a decade ago and document the proper implementation of the FS-NEAT and FD-NEAT extensions.
- To apply and compare these two approaches against standard machine learning methods on real-world datasets.
Framework of the Thesis ■
- Literature review (ETOC: 2 months): familiarize with the existing literature on NEAT, FS-NEAT
and FD-NEAT, identify the best-suited GPU version of NEAT, and get the basic demos of
NEAT working on typical benchmark problems.
- Implement FS-NEAT and FD-NEAT in the selected NEAT version.
- Comparative analysis of the three algorithms on challenging real world problems.
Expected Student Profile ■
Currently following an MSc in a field related to one or more of the following: Computer
Science, Biomedical Engineering, Applied Computer Science - Digital Health.
Strong programming skills (Python).
Ability to write scientific reports and communicate research results at
conferences in English.