The Impact of Learning Rate on the Backpropagation Algorithm in MATLAB
DOI: https://doi.org/10.22555/pjets.v11i2.1014

Keywords: Artificial Neural Network, Backpropagation Algorithm, Hidden Layer, Sigmoid

Abstract
Artificial Neural Networks (ANNs) are highly interconnected networks of simple processing units. Backpropagation is a common method for training ANNs by minimizing an objective function. This study describes an implementation of the backpropagation algorithm. The error generated at the output is propagated back toward the input, and the neuron weights are updated under different supervised learning rates, a generalization of the delta rule. A sigmoid function was used as the activation function. The design was simulated in MATLAB R2018a. A maximum accuracy of 0.9988 was achieved with four hidden layers.
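The abstract's training loop (sigmoid activations, output error propagated backward, weight updates scaled by a learning rate via the generalized delta rule) can be sketched as follows. This is a minimal illustrative example in Python/NumPy, not the authors' MATLAB R2018a implementation; the network size, XOR task, and parameter names (`lr`, `epochs`, `hidden`) are assumptions for demonstration only.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, as used in the study."""
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(lr=0.5, epochs=5000, hidden=4, seed=0):
    """Train a tiny one-hidden-layer network on XOR with plain
    gradient descent; returns (initial loss, final loss)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0.0, 1.0, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        losses.append(float(np.mean((out - y) ** 2)))
        # Backward pass: the output error is fed back through the
        # sigmoid derivative s*(1-s) (generalized delta rule).
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # Weight updates scaled by the learning rate `lr`.
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return losses[0], losses[-1]
```

Varying `lr` in such a sketch illustrates the paper's subject: too small a learning rate converges slowly, too large a rate can oscillate or diverge.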
License
Copyright (c) 2023 Abdul Ghafoor Shaikh, Wajid Ali Shaikh

This work is licensed under a Creative Commons Attribution 4.0 International License.