This program demonstrates how the Gradient Descent algorithm works in linear regression.
Given a set of points, linear regression finds the best-fit line for predicting similar points.
The gradient descent algorithm iteratively adjusts the parameters of the line equation to minimize the error.
The problem is summarized by the formulas below.
On each step, the algorithm decreases each theta by alpha times the partial derivative of the error function with respect to that theta, where alpha is the learning rate.
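In the usual notation from the referenced course (hypothesis $h_\theta$, $m$ training points), the formulas are roughly the following; the exact slides may present them slightly differently:

$$h_\theta(x) = \theta_0 + \theta_1 x$$

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

$$\theta_j := \theta_j - \alpha\,\frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1) \quad \text{(updated simultaneously for } j = 0, 1\text{)}$$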
In this program, alpha decays over the iterations rather than staying fixed.
Initially the error is large and the thetas are far from the minimum of the parabolic error function.
As the algorithm proceeds, the error shrinks and the thetas converge to their optimal values.
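As a rough illustration (not the repository's actual code; the function name, decay schedule `alpha / (1 + decay * step)`, and default values below are assumptions), a plain NumPy sketch of gradient descent with a decaying learning rate might look like this:

```python
import numpy as np

def fit_line(x, y, alpha=0.3, decay=0.001, steps=2000):
    """Fit y ≈ theta0 + theta1 * x by gradient descent with a decaying learning rate.

    The decay schedule and default values here are illustrative assumptions,
    not necessarily what this repository's program uses.
    """
    m = len(x)
    theta0, theta1 = 0.0, 0.0
    for step in range(steps):
        lr = alpha / (1.0 + decay * step)      # decaying learning rate
        pred = theta0 + theta1 * x             # current predictions h(x)
        error = pred - y                       # residuals
        grad0 = error.sum() / m                # dJ/dtheta0
        grad1 = (error * x).sum() / m          # dJ/dtheta1
        theta0 -= lr * grad0                   # simultaneous update of both thetas
        theta1 -= lr * grad1
    return theta0, theta1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 50)
    y = 3.0 + 2.0 * x + rng.normal(0.0, 0.1, 50)   # noisy points around y = 3 + 2x
    print(fit_line(x, y))                          # roughly (3.0, 2.0)
```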
In `settings.py`, generate a 50-character key for `SECRET_KEY` using a key generator. If the host is not allowed, add the local host IP address to `ALLOWED_HOSTS`, or set `DEBUG = True`.
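For example (illustrative values only; generate your own key and use your own host IP):

```python
# One way to generate a key is Django's own helper:
#   python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"

# settings.py
SECRET_KEY = "<paste-your-generated-50-character-key-here>"   # placeholder, not a real key
ALLOWED_HOSTS = ["127.0.0.1", "localhost"]                    # add your local host IP if needed
# or, for local development only:
DEBUG = True
```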
Open a terminal in the project root and start the server with the following command:

```
python manage.py runserver
```
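By default the development server listens at http://127.0.0.1:8000/; a different port can be passed, e.g. `python manage.py runserver 8080`.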
After changing the model's attributes (fields), run the following commands:

```
python manage.py makemigrations
python manage.py migrate
```
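`makemigrations` creates new migration files from the model changes, and `migrate` applies them to the database.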
Slides are taken from Andrew Ng's Machine Learning course (Stanford University, on Coursera).