dc.contributor.advisor | Kleppe, Tore Selland | |
dc.contributor.author | Slettevold, Simon | |
dc.date.accessioned | 2024-06-21T15:51:26Z | |
dc.date.available | 2024-06-21T15:51:26Z | |
dc.date.issued | 2024 | |
dc.identifier | no.uis:inspera:231510674:233466660 | |
dc.identifier.uri | https://hdl.handle.net/11250/3135369 | |
dc.description.abstract | This thesis explores algorithms for optimizing the training process of a neural network for deep learning purposes, focusing on gradient descent algorithms such as SGD ("Stochastic Gradient Descent") and ADAM ("Adaptive Moment Estimation"). The fundamental theory of neural networks and their mathematics is explained thoroughly and put into practice through practical examples. Using these examples, different gradient descent schemes and algorithms are compared. Across these comparisons, the ADAM algorithm performs most favorably. | |
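As a minimal illustration of the two update rules the abstract names, the Python sketch below contrasts a plain gradient descent step with ADAM's bias-corrected moment updates. It is not taken from the thesis: the toy objective f(x) = (x - 3)^2, the step sizes, and the ADAM hyperparameters are conventional defaults assumed here for demonstration.

    import numpy as np

    def grad(x):
        # Gradient of the toy objective f(x) = (x - 3)^2 (assumed example).
        return 2.0 * (x - 3.0)

    # Plain gradient descent step: x <- x - lr * g
    x_sgd = 0.0
    lr = 0.1
    for _ in range(100):
        x_sgd -= lr * grad(x_sgd)

    # ADAM: running first and second moments of the gradient,
    # with bias correction before the parameter update.
    x_adam, m, v = 0.0, 0.0, 0.0
    alpha, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8
    for t in range(1, 101):
        g = grad(x_adam)
        m = beta1 * m + (1 - beta1) * g        # first moment (mean of gradients)
        v = beta2 * v + (1 - beta2) * g * g    # second moment (uncentered variance)
        m_hat = m / (1 - beta1 ** t)           # bias-corrected estimates
        v_hat = v / (1 - beta2 ** t)
        x_adam -= alpha * m_hat / (np.sqrt(v_hat) + eps)

    # Both iterates should approach the minimizer x = 3.
    print(f"SGD:  x = {x_sgd:.6f}")
    print(f"ADAM: x = {x_adam:.6f}")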
dc.language | eng | |
dc.publisher | UIS | |
dc.title | Gradient Descent Methods for Training Neural Networks | |
dc.type | Bachelor thesis | |