Regression Analysis

Regression analysis models explained and implemented using Python

Regression analysis is a set of statistical methods for estimating the relationship between an independent variable (X) and a dependent variable (Y).

The simplest form of regression analysis is linear regression, where the goal is to find a line that fits our data. Sometimes, however, linear regression is not suitable for the problem, because the relationship between the independent variable (X) and the dependent variable (Y) is polynomial.

The linear regression model can be represented mathematically by this formula:

Y = α + βX + ε

where X is our independent variable, Y is our dependent variable, α and β are the model parameters, and ε is the error term.

1. Linear Regression with the Least Squares Method

The method of least squares is a standard approach in regression analysis that approximates the solution by minimizing the sum of the squared residuals, that is, the squared differences between the observed values and the values predicted by the model.

The least squares method estimates the parameters α and β using these simple formulas:

β = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / Σᵢ (xᵢ − x̄)²

α = ȳ − β·x̄

But wait a minute: before we can use these formulas, we need to know where they come from. They are obtained by setting the partial derivatives of the sum of squared residuals with respect to α and β to zero and solving for the parameters.
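In outline (using S to denote the sum of squared residuals, a notation introduced here for clarity):

```latex
\begin{align*}
S(\alpha,\beta) &= \sum_{i=1}^{n} \left(y_i - \alpha - \beta x_i\right)^2 \\
\frac{\partial S}{\partial \alpha} = -2\sum_{i=1}^{n} \left(y_i - \alpha - \beta x_i\right) &= 0
  \;\Rightarrow\; \alpha = \bar{y} - \beta\bar{x} \\
\frac{\partial S}{\partial \beta} = -2\sum_{i=1}^{n} x_i\left(y_i - \alpha - \beta x_i\right) &= 0
  \;\Rightarrow\; \beta = \frac{\sum_i (x_i-\bar{x})(y_i-\bar{y})}{\sum_i (x_i-\bar{x})^2}
\end{align*}
```

Substituting the first result into the second equation eliminates α and yields the closed form for β.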

The implementation of linear regression with least squares in Python:
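A minimal sketch of such an implementation, assuming NumPy; the synthetic data, seed, and variable names below are illustrative choices, not the original listing:

```python
import numpy as np

# Illustrative synthetic data: y = 2 + 3x + noise
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, 100)
Y = 2 + 3 * X + rng.normal(0, 2, 100)

# Closed-form least squares estimates of beta and alpha
x_mean, y_mean = X.mean(), Y.mean()
beta = np.sum((X - x_mean) * (Y - y_mean)) / np.sum((X - x_mean) ** 2)
alpha = y_mean - beta * x_mean

print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
y_pred = alpha + beta * X  # points on the fitted line
```

With data generated as above, the estimates should land close to the true values α = 2 and β = 3.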

The least squares method gives us this beautiful line:

[Figure: linear regression fit using the least squares method]

2. Linear Regression with Gradient Descent

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point.

θ ← θ − η·∇J(θ)

where η is the learning rate and J is the function being minimized.

Now we know which algorithm we are going to use for the optimization process, but wait a minute: which function are we going to optimize?

Ladies and gentlemen, this function is called the cost function. It measures the difference between the predicted value and the real value; we can see it as the function that tells us how far the predictions are from the true values.

In our case, the cost function is defined as follows:

J(α, β) = (1/2m) · Σᵢ (α + βxᵢ − yᵢ)²

The derivatives of the cost function with respect to the parameters are:

∂J/∂α = (1/m) · Σᵢ (α + βxᵢ − yᵢ)

∂J/∂β = (1/m) · Σᵢ (α + βxᵢ − yᵢ) · xᵢ

The implementation of linear regression with gradient descent in Python:
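A minimal sketch, again assuming NumPy; the learning rate and iteration count are illustrative choices, and X and Y are the arrays from the previous example:

```python
import numpy as np

def gradient_descent(X, Y, lr=0.01, n_iters=2000):
    """Fit y = alpha + beta*x by gradient descent on the cost J."""
    m = len(X)
    alpha, beta = 0.0, 0.0
    costs = []
    for _ in range(n_iters):
        error = alpha + beta * X - Y           # prediction error
        alpha -= lr * error.sum() / m          # step along dJ/d(alpha)
        beta -= lr * (error * X).sum() / m     # step along dJ/d(beta)
        costs.append((error ** 2).sum() / (2 * m))
    return alpha, beta, costs

alpha, beta, costs = gradient_descent(X, Y)
```

The `costs` list is what produces the cost-versus-iterations plot shown further below.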

Gradient descent gives us this result, which is actually very good:

[Figure: linear regression fit obtained with gradient descent]

The cost function versus the gradient descent iterations:

[Figure: cost function decreasing over gradient descent iterations]

Polynomial Regression

Polynomial regression is a form of regression analysis in which the relationship between the independent variable x and the dependent variable y is modelled as an nth-degree polynomial in x. The polynomial regression model can be described mathematically as below:

y = β₀ + β₁x + β₂x² + … + βₙxⁿ + ε

In matrix representation, polynomial regression can be written as:

y = Xβ + ε

where X is the design (Vandermonde) matrix whose i-th row is [1, xᵢ, xᵢ², …, xᵢⁿ].

1. Polynomial Regression Using Ordinary Least Squares Estimation

To calculate the coefficients, we use the normal equation, which is represented as:

β = (XᵀX)⁻¹ · Xᵀy

But wait a minute: before we can use this formula, we need to know where it comes from. The normal equation is obtained by minimizing the squared error in matrix form and setting its gradient with respect to β to zero.
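In outline, minimizing J(β) = ‖y − Xβ‖² and assuming XᵀX is invertible:

```latex
\begin{align*}
J(\beta) &= (y - X\beta)^\top (y - X\beta) \\
\nabla_\beta J = -2\,X^\top (y - X\beta) &= 0
  \;\Rightarrow\; X^\top X\,\beta = X^\top y
  \;\Rightarrow\; \beta = (X^\top X)^{-1} X^\top y
\end{align*}
```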

The implementation of polynomial regression using the normal equation:
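A minimal sketch, assuming NumPy; the quadratic synthetic data and the degree are illustrative choices:

```python
import numpy as np

# Illustrative synthetic data with a quadratic relationship
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 100)
y = 1 - 2 * x + 0.5 * x ** 2 + rng.normal(0, 0.5, 100)

degree = 2
# Design (Vandermonde) matrix: columns 1, x, x^2, ..., x^degree
X = np.vander(x, degree + 1, increasing=True)

# Normal equation: solve (X^T X) beta = X^T y
# (np.linalg.solve is more stable than forming the inverse explicitly)
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)  # estimated coefficients [b0, b1, ..., bn]
y_pred = X @ beta
```

Solving the linear system directly, rather than computing (XᵀX)⁻¹, is the usual way to implement the normal equation in practice.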

Polynomial regression using the normal equation gives us this result:

[Figure: polynomial regression fit using the normal equation]

2. Polynomial Regression Using Gradient Descent

As in the linear case, we use gradient descent: we take repeated steps in the opposite direction of the gradient of the cost function until we reach a minimum. For the polynomial model, the cost function is defined as follows:

J(β) = (1/2m) · ‖Xβ − y‖²

The implementation of polynomial regression using gradient descent:
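A minimal sketch, reusing the design matrix X and targets y built in the normal-equation example above; the learning rate and iteration count are again illustrative:

```python
import numpy as np

def poly_gradient_descent(X, y, lr=0.01, n_iters=5000):
    """Minimize J(beta) = (1/2m) * ||X @ beta - y||^2 by gradient descent."""
    m, n = X.shape
    beta = np.zeros(n)
    costs = []
    for _ in range(n_iters):
        error = X @ beta - y
        beta -= lr * (X.T @ error) / m      # gradient step
        costs.append(error @ error / (2 * m))
    return beta, costs

beta_gd, costs = poly_gradient_descent(X, y)
```

For higher polynomial degrees or wider input ranges, the features usually need to be scaled for gradient descent to converge with a fixed learning rate.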

Polynomial regression using gradient descent gives us this result:

[Figure: polynomial regression fit obtained with gradient descent]

The cost of the polynomial regression using gradient descent is:

[Figure: cost function of polynomial regression over gradient descent iterations]
