
Linear Regression (Part-2)

Krishna
10 min readOct 16, 2022


In Part 1, we saw how the linear regression algorithm works and how it predicts target values by learning from existing data. In this part, we will look at how to evaluate the predictions made by the model, understand the bias-variance trade-off, and see how polynomial regression works.

Evaluating the predictions of a Linear Regression model

Before we jump into the discussion, we first need to understand what an error is.

Error

In general, an error is a mistake. From an ML perspective, an error is the gap between what the model classifies or predicts and the actual class or value.

ML algorithms can be used for prediction, for classification, or in some cases for both. In every case, each time the machine is trained on the training data, an error metric should be used to calculate the error, which tells us how well the machine is performing its job. For regression tasks, we calculate the mean squared error (MSE), the mean absolute error (MAE), and the root mean squared error (RMSE). There are a few more, but these are the metrics we use most regularly.
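As a minimal sketch (using NumPy and made-up toy numbers, not data from this article), the three metrics above can be computed like this:

```python
import numpy as np

# Hypothetical house prices (in thousands) and the model's predictions
y_true = np.array([250.0, 300.0, 180.0])
y_pred = np.array([240.0, 330.0, 200.0])

errors = y_pred - y_true

mae = np.mean(np.abs(errors))  # mean absolute error: average size of the miss
mse = np.mean(errors ** 2)     # mean squared error: penalises large misses more
rmse = np.sqrt(mse)            # root mean squared error: MSE back in price units

print(f"MAE:  {mae:.2f}")   # 20.00
print(f"MSE:  {mse:.2f}")   # 466.67
print(f"RMSE: {rmse:.2f}")  # 21.60
```

Note that MAE and RMSE are in the same units as the target (thousands here), which makes them easier to interpret than MSE, whose units are squared.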

Let me explain error evaluation using an example of house prices.



Written by Krishna

Machine learning | Statistics | Neural Networks | Data Visualisation. Data science aspirant and fictional-story writer, sharing my knowledge through blogs.
