
Coding-step-size-optimizers

Hand-coded step-size optimizers (momentum and Adam) for deep learning neural networks, implemented by modifying the update logic used in stochastic gradient descent.

Project Description

The project contains a ComputationalGraphPrimer file with all the classes required for training a model through stochastic gradient descent (SGD). The two other files, for the single-neuron classifier and the multi-neuron classifier, contain classes whose functions are overridden to implement SGD with momentum and with Adam, as sketched below.
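
For reference, here is a minimal sketch of the two update rules, written in plain NumPy. The function and parameter names (lr, beta, beta1, beta2, eps) are illustrative assumptions, not the variables used in the repository's overridden functions.

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=0.01, beta=0.9):
    """One SGD-with-momentum update: keep a decaying accumulation of past
    gradients (the velocity) and step the parameters along it."""
    velocity = beta * velocity + grad
    param = param - lr * velocity
    return param, velocity

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m) and
    its square (v), with bias correction for the early steps (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * (grad ** 2)
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

In both cases the plain SGD step `param = param - lr * grad` is replaced by a step that also depends on running statistics of past gradients, which is why the repository implements them by overriding the SGD training functions rather than rewriting the classifiers.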
