
dc.contributor.advisor: Chakravorty, Antorweep
dc.contributor.advisor: Agrawal, Bikash
dc.contributor.author: Atamanczuk, Bruna
dc.contributor.author: Karadas, Kurt Arve Skipenes
dc.date.accessioned: 2023-09-17T15:51:13Z
dc.date.available: 2023-09-17T15:51:13Z
dc.date.issued: 2023
dc.identifier: no.uis:inspera:129729955:59494566
dc.identifier.uri: https://hdl.handle.net/11250/3089931
dc.description.abstract: Continuous learning plays a crucial role in advancing machine learning by addressing the challenges posed by evolving data and complex learning tasks. This thesis presents a novel approach to these challenges: inspired by evolutionary strategies, it introduces perturbations to the weights and biases of a neural network while still leveraging backpropagation. The method maintains or improves accuracy across all 16 scenarios investigated, without catastrophic forgetting. Experiments were conducted on three benchmark datasets: MNIST, Fashion-MNIST, and CIFAR-10. The approach was evaluated with different deep learning models, including an MLP and a CNN, and the data was split using both stratified and non-stratified sampling, with and without missing classes. The approach adapts to new classes without compromising performance and offers scalability for real-world scenarios. Overall, it shows promise in maintaining accuracy and adapting to changing data conditions while retaining knowledge from previous tasks.
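The abstract describes combining evolutionary-strategy-style perturbations of a network's weights and biases with ordinary backpropagation. As a rough illustration only, the following Python (PyTorch) sketch perturbs copies of a model with Gaussian noise, keeps the lowest-loss candidate, and then applies a standard gradient step. It is not the thesis implementation: the function name perturb_then_backprop, the noise scale sigma, the candidate count, and the keep-the-best selection rule are all assumptions made for illustration.

    # Minimal sketch (not the thesis algorithm): evolutionary-style weight
    # perturbation combined with a backpropagation step. All names and
    # hyperparameters below are illustrative assumptions.
    import copy
    import torch
    import torch.nn as nn

    def perturb_then_backprop(model, loss_fn, x, y,
                              sigma=0.01, lr=0.1, n_candidates=4):
        # Evaluate the unperturbed model as the baseline candidate.
        best_model, best_loss = model, loss_fn(model(x), y).item()
        for _ in range(n_candidates):
            candidate = copy.deepcopy(model)
            with torch.no_grad():
                for p in candidate.parameters():
                    # ES-style Gaussian noise on every weight and bias.
                    p.add_(sigma * torch.randn_like(p))
            cand_loss = loss_fn(candidate(x), y).item()
            if cand_loss < best_loss:
                best_model, best_loss = candidate, cand_loss
        # Ordinary backpropagation step on the surviving candidate.
        opt = torch.optim.SGD(best_model.parameters(), lr=lr)
        opt.zero_grad()
        loss = loss_fn(best_model(x), y)
        loss.backward()
        opt.step()
        return best_model

    # Usage example on random data shaped like flattened MNIST (28*28 = 784).
    model = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    model = perturb_then_backprop(model, nn.CrossEntropyLoss(), x, y)

One design note on the sketch: keeping the best of several noisy copies before the gradient step is just one plausible way to combine perturbation with backpropagation; the thesis may select, scale, or schedule perturbations differently.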
dc.language: eng
dc.publisher: uis
dc.title: Evolving Deep Neural Networks for Continuous Learning: Addressing Challenges and Adapting to Changing Data Conditions without Catastrophic Forgetting
dc.type: Master thesis


Associated file(s)


This item appears in the following collection(s)
