The following describes how to find the weights of a linear equation using gradient descent. In this code, the weights of the linear equation are given. We randomly generate the predictor values along with normally distributed error with mean 0 and standard deviation 2, and compute the response from the given equation plus that noise. We then estimate the weights using gradient descent. All the code is written in R, and it also plots how the gradient decreases at each iteration.
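The sketch below illustrates that workflow under stated assumptions; it is not the repository's actual script. It simulates data from a linear equation with known weights, adds N(0, 2) noise, recovers the weights with batch gradient descent on the squared loss, and plots how the gradient shrinks per iteration. Names such as `true_w`, `alpha`, and `n_iter` are illustrative choices, not taken from the repository.

```r
# Minimal sketch of batch gradient descent for squared loss (assumed setup)

set.seed(42)

n <- 200
x <- runif(n, -5, 5)                              # randomly generated predictor
true_w <- c(3, 1.5)                               # assumed given intercept and slope
y <- true_w[1] + true_w[2] * x +
     rnorm(n, mean = 0, sd = 2)                   # response = linear equation + noise

X <- cbind(1, x)                                  # design matrix with intercept column
w <- c(0, 0)                                      # initial weights
alpha <- 0.01                                     # learning rate
n_iter <- 500
grad_norm <- numeric(n_iter)                      # gradient magnitude at each iteration

for (i in seq_len(n_iter)) {
  residuals <- as.vector(X %*% w) - y             # prediction error on the full batch
  gradient <- as.vector(t(X) %*% residuals) / n   # gradient of the mean squared loss
  w <- w - alpha * gradient                       # batch gradient descent update
  grad_norm[i] <- sqrt(sum(gradient^2))
}

print(w)                                          # should be close to true_w

# Plot how the gradient decreases as the iterations proceed
plot(seq_len(n_iter), grad_norm, type = "l",
     xlab = "Iteration", ylab = "Gradient norm",
     main = "Gradient decrease per iteration")
```

With the full batch used at every step, the gradient norm falls smoothly toward zero as the weights approach the least-squares solution; a learning rate that is too large would instead make it diverge.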
anuragkumar/Gradient-Descent-for-Squared-Loss
About
Batch Gradient Descent for Supervised Machine Learning