dc.contributor.advisor | Chakravorty, Antorweep | |
dc.contributor.advisor | Agrawal, Bikash | |
dc.contributor.author | Atamanczuk, Bruna | |
dc.contributor.author | Karadas, Kurt Arve Skipenes | |
dc.date.accessioned | 2023-09-17T15:51:13Z | |
dc.date.available | 2023-09-17T15:51:13Z | |
dc.date.issued | 2023 | |
dc.identifier | no.uis:inspera:129729955:59494566 | |
dc.identifier.uri | https://hdl.handle.net/11250/3089931 | |
dc.description.abstract | Continuous learning plays a crucial role in advancing the field of machine learning by addressing the challenges posed by evolving data and complex learning tasks. This thesis presents a novel approach to the challenges of continuous learning. Inspired by evolutionary strategies, the approach introduces perturbations to the weights and biases of a neural network while leveraging backpropagation. The method demonstrates stable or improved accuracy across the 16 scenarios investigated, without catastrophic forgetting. The experiments were conducted on three benchmark datasets: MNIST, Fashion-MNIST, and CIFAR-10. Furthermore, different deep learning models were used to evaluate the approach, including multilayer perceptrons (MLPs) and convolutional neural networks (CNNs). The data were split using stratified and non-stratified sampling, both with and without missing classes. The approach adapts to new classes without compromising performance and offers scalability in real-world scenarios. Overall, it shows promise in maintaining accuracy and adapting to changing data conditions while retaining knowledge from previous tasks. | |
dc.language | eng | |
dc.publisher | uis | |
dc.title | Evolving Deep Neural Networks for Continuous Learning: Addressing Challenges and Adapting to Changing Data Conditions without Catastrophic Forgetting | |
dc.type | Master thesis | |