Evolving Deep Neural Networks for Continuous Learning: Addressing Challenges and Adapting to Changing Data Conditions without Catastrophic Forgetting
Master thesis
Permanent link: https://hdl.handle.net/11250/3089931
Publication date: 2023
Collections: Studentoppgaver (TN-IDE) [911]
Abstract
Continuous learning plays a crucial role in advancing the field of machine learning by addressing the challenges posed by evolving data and complex learning tasks. This thesis presents a novel approach to continuous learning. Inspired by evolutionary strategies, the approach introduces perturbations to the weights and biases of a neural network while leveraging backpropagation. The method demonstrates stable or improved accuracy, without catastrophic forgetting, across the 16 scenarios investigated. The experiments were conducted on three benchmark datasets: MNIST, Fashion-MNIST, and CIFAR-10. The approach was evaluated with different deep learning architectures, namely an MLP and a CNN. The data was split using both stratified and non-stratified sampling, with and without missing classes. The approach adapts to new classes without compromising performance and offers scalability in real-world scenarios. Overall, it shows promise in maintaining accuracy and adapting to changing data conditions while retaining knowledge from previous tasks.
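The core idea described above — gradient-based training combined with evolutionary-strategy-style weight perturbations — can be sketched in a few lines. The following is a minimal illustration on a toy linear-regression problem, not the thesis's actual algorithm: the model, the perturbation scale `sigma`, and the accept-if-not-worse rule are all illustrative assumptions standing in for the neural networks and datasets used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a linear-regression task standing in for the thesis's image datasets.
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = X @ true_w

w = np.zeros(5)
lr = 0.1      # learning rate for the gradient step
sigma = 0.01  # perturbation scale (hypothetical hyperparameter)


def mse(weights):
    """Mean squared error of the linear model with the given weights."""
    return float(np.mean((X @ weights - y) ** 2))


for step in range(200):
    # Gradient step (the analytic analogue of a backpropagation update).
    grad = 2 * X.T @ (X @ w - y) / len(X)
    w = w - lr * grad

    # ES-inspired perturbation of the weights; here we keep the noisy
    # candidate only if it does not increase the loss (an illustrative rule).
    candidate = w + sigma * rng.normal(size=w.shape)
    if mse(candidate) <= mse(w):
        w = candidate

final_loss = mse(w)
```

In a continual-learning setting, the perturbation step would be applied to the weights and biases of a deep network as new tasks or classes arrive, with backpropagation still driving the main updates.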