Commit 0abfd03

Removed LaTex equations
1 parent 236c4e4 commit 0abfd03

File tree

1 file changed: +4 −22 lines changed


Linear Regression/README.markdown

Lines changed: 4 additions & 22 deletions
```diff
@@ -30,9 +30,8 @@ We can describe the straight line in terms of two variables:
 
 This is the equation for our line:
 
-$$
-carPrice = slope \times carAge + intercept
-$$
+carPrice = slope * carAge + intercept
+
 
 How can we find the best values for the intercept and the slope? Let's look at two different ways to do this.
 
```
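The equation that replaces the LaTeX block is simple enough to sketch in code. A minimal Python illustration (the function name and the sample slope/intercept values are invented here, not taken from the README):

```python
def predict_price(car_age, slope, intercept):
    """The line from the README: carPrice = slope * carAge + intercept."""
    return slope * car_age + intercept

# Hypothetical fit: prices start at 20000 and drop by 1000 per year of age.
print(predict_price(3, -1000.0, 20000.0))  # a 3-year-old car -> 17000.0
```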

```diff
@@ -75,26 +74,9 @@ for n in 1...iterations {
 
 The program loops through each data point (each car age and car price). For each data point it adjusts the intercept and the slope to bring them closer to the correct values. The equations used in the code to adjust the intercept and the slope are based on moving in the direction of the maximal reduction of these variables. This is a *gradient descent*.
 
-We want to minimise the square of the distance between the line and the points. Let's define a function J which represents this distance - for simplicity we consider only one point here:
-
-$$
-J \propto ((slope \times carAge + intercept) - carPrice)^2
-$$
-
-In order to move in the direction of maximal reduction, we take the partial derivative of this function with respect to the slope:
-
-$$
-\frac{\partial J}{\partial (slope)} \propto ((slope \times carAge + intercept) - carPrice) \times carAge
-$$
-
-And similarly for the intercept:
-
-$$
-\frac{\partial J}{\partial (intercept)} \propto
-(slope \times carAge + intercept) - carPrice
-$$
+We want to minimise the square of the distance between the line and the points. We define a function J which represents this distance - for simplicity we consider only one point here. This function J is proportional to ((slope * carAge + intercept) - carPrice)^2.
 
-We multiply these derivatives by our factor alpha and then use them to adjust the values of slope and intercept on each iteration.
+In order to move in the direction of maximal reduction, we take the partial derivative of this function with respect to the slope, and similarly for the intercept. We multiply these derivatives by our factor alpha and then use them to adjust the values of slope and intercept on each iteration.
 
 Looking at the code, it intuitively makes sense - the larger the difference between the current predicted car price and the actual car price, and the larger the value of ```alpha```, the greater the adjustments to the intercept and the slope.
 
```
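The per-point update described in the new text can be sketched as follows. This is a Python illustration only (the README's actual implementation is in Swift, and the sample data and `alpha` value here are invented):

```python
# Made-up sample data: car ages in years and the corresponding prices.
car_age = [10.0, 8.0, 3.0, 1.0, 11.0]
car_price = [500.0, 400.0, 7000.0, 8500.0, 475.0]

slope = 0.0
intercept = 0.0
alpha = 0.001        # small factor scaling each adjustment
iterations = 10000

for _ in range(iterations):
    for age, price in zip(car_age, car_price):
        # Error of the current line at this point:
        # (slope * carAge + intercept) - carPrice
        error = (slope * age + intercept) - price
        # Step against the partial derivatives of J:
        #   dJ/d(slope)     is proportional to error * carAge
        #   dJ/d(intercept) is proportional to error
        slope -= alpha * error * age
        intercept -= alpha * error

# With this data the fitted line slopes downward:
# older cars predict lower prices.
print(slope, intercept)
```

The larger the error at a point and the larger `alpha`, the bigger each adjustment, exactly as the text observes.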