Adıyaman University Institutional Repository (Adıyaman Üniversitesi Kurumsal Arşivi)

Comparison of the Stochastic Gradient Descent Based Optimization Techniques

Show simple item record

dc.contributor.author Yazan, Ersan
dc.contributor.author Talu, M. Fatih
dc.date.accessioned 2024-07-01T05:11:06Z
dc.date.available 2024-07-01T05:11:06Z
dc.date.issued 2017
dc.identifier.issn 0000-0003-1166-8404
dc.identifier.uri http://dspace.adiyaman.edu.tr:8080/xmlui/handle/20.500.12414/5266
dc.description.abstract The stochastic gradient descent method (SGD) is a popular optimization technique based on updating each parameter theta(k) in the direction of the partial derivative ∂J(θ)/∂θ(k) to minimize/maximize the cost function J(θ). This technique is frequently used in current machine learning methods such as convolutional networks and autoencoders. In this study, five different approaches (Momentum, Adagrad, Adadelta, RMSprop and Adam) based on SGD used in updating the theta parameters were investigated. By selecting specific test functions, the advantages and disadvantages of each approach are compared with each other in terms of the number of oscillations, the parameter update rate and the minimum cost reached. The comparison results are shown graphically. tr
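The five update rules compared in the abstract can be sketched as follows on a one-dimensional quadratic cost J(θ) = θ², a stand-in for the paper's test functions, which are not listed in this record. All learning rates and decay constants below are illustrative choices, not the paper's settings:

```python
import math

def grad(theta):
    # Gradient of the illustrative cost J(theta) = theta^2
    return 2.0 * theta

def run(update, theta0=5.0, steps=100):
    # Apply one update rule repeatedly; `state` holds the rule's running moments.
    theta, state = theta0, {}
    for _ in range(steps):
        theta = update(theta, grad(theta), state)
    return theta

def momentum(theta, g, s, lr=0.1, beta=0.9):
    # Velocity accumulates past gradients, damping oscillations.
    s["v"] = beta * s.get("v", 0.0) + lr * g
    return theta - s["v"]

def adagrad(theta, g, s, lr=0.5, eps=1e-8):
    # Per-parameter step shrinks as squared gradients accumulate.
    s["G"] = s.get("G", 0.0) + g * g
    return theta - lr * g / (math.sqrt(s["G"]) + eps)

def rmsprop(theta, g, s, lr=0.1, rho=0.9, eps=1e-8):
    # Exponentially decayed average of squared gradients instead of a full sum.
    s["E"] = rho * s.get("E", 0.0) + (1 - rho) * g * g
    return theta - lr * g / (math.sqrt(s["E"]) + eps)

def adadelta(theta, g, s, rho=0.95, eps=1e-6):
    # No global learning rate: step size is derived from past update magnitudes.
    s["Eg"] = rho * s.get("Eg", 0.0) + (1 - rho) * g * g
    dx = -math.sqrt(s.get("Ex", 0.0) + eps) / math.sqrt(s["Eg"] + eps) * g
    s["Ex"] = rho * s.get("Ex", 0.0) + (1 - rho) * dx * dx
    return theta + dx

def adam(theta, g, s, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Bias-corrected first and second moment estimates of the gradient.
    s["t"] = s.get("t", 0) + 1
    s["m"] = b1 * s.get("m", 0.0) + (1 - b1) * g
    s["v"] = b2 * s.get("v", 0.0) + (1 - b2) * g * g
    mhat = s["m"] / (1 - b1 ** s["t"])
    vhat = s["v"] / (1 - b2 ** s["t"])
    return theta - lr * mhat / (math.sqrt(vhat) + eps)

for name, upd in [("Momentum", momentum), ("Adagrad", adagrad),
                  ("RMSprop", rmsprop), ("Adadelta", adadelta), ("Adam", adam)]:
    print(f"{name:9s} theta after 100 steps: {run(upd):+.4f}")
```

Running the loop prints how close each rule gets to the minimum at θ = 0 after a fixed budget of steps, a simplified version of the comparison criteria (oscillation count, update rate, minimum cost reached) used in the study.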
dc.language.iso en tr
dc.publisher IEEE tr
dc.title Comparison of the Stochastic Gradient Descent Based Optimization Techniques tr
dc.type Other tr
dc.contributor.authorID 0000-0003-1166-8404 tr
dc.contributor.department Adiyaman Univ, Besni Meslek Yuksekokulu Bilgisayar Teknol, Adiyaman, Turkey tr
dc.contributor.department Inonu Univ, Muhendisl Fak, Bilgisayar Muhendisligi, Malatya, Turkey tr
dc.source.title 2017 INTERNATIONAL ARTIFICIAL INTELLIGENCE AND DATA PROCESSING SYMPOSIUM (IDAP) tr


Files in this item:

This item appears in the following Collection(s).
