Show simple item record

dc.contributor.advisor	Kleppe, Tore Selland
dc.contributor.author	Slettevold, Simon
dc.date.accessioned	2024-06-21T15:51:26Z
dc.date.available	2024-06-21T15:51:26Z
dc.date.issued	2024
dc.identifier	no.uis:inspera:231510674:233466660
dc.identifier.uri	https://hdl.handle.net/11250/3135369
dc.description.abstract	This thesis explores algorithms for optimizing the training process of a neural network for deep learning purposes, with a focus on gradient descent algorithms such as SGD (Stochastic Gradient Descent) and ADAM (Adaptive Moment Estimation). The fundamental theory of neural networks and the underlying mathematics is explained thoroughly and put into practice through practical examples. Using these examples, different gradient descent schemes and algorithms are compared. Across these comparisons, the ADAM algorithm proves the most favorable.
dc.language	eng
dc.publisher	UIS
dc.title	Gradient Descent Methods for Training Neural Networks
dc.type	Bachelor thesis
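
Note: the abstract names SGD and ADAM without reproducing their update rules. As a rough illustrative sketch only (not code taken from the thesis, and assuming the commonly quoted default hyperparameters lr, beta1, beta2, eps), the two parameter updates can be written in NumPy as:

    import numpy as np

    def sgd_step(theta, grad, lr=0.01):
        """Plain (stochastic) gradient descent: step against the gradient."""
        return theta - lr * grad

    def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        """One ADAM update: running moment estimates with bias correction (t = 1, 2, ...)."""
        m = beta1 * m + (1 - beta1) * grad        # first-moment estimate (mean of gradients)
        v = beta2 * v + (1 - beta2) * grad**2     # second-moment estimate (uncentred variance)
        m_hat = m / (1 - beta1**t)                # bias-corrected first moment
        v_hat = v / (1 - beta2**t)                # bias-corrected second moment
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

The key difference the sketch illustrates is that SGD applies one global learning rate, whereas ADAM scales each parameter's step by its own gradient moment estimates, which is consistent with the abstract's finding that ADAM performed most favorably in the comparisons.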

