 # Least Squares Criterion

Updated on October 2, 2022

## What is the Least Squares Criterion?

The least squares criterion refers to the formula used to measure how accurately a straight line represents the data that was used to generate it. In other words, the formula determines the line of best fit. The least squares approach helps in predicting the behaviour of dependent variables, and the resulting line is called the least-squares regression line. The criterion is satisfied by minimising the sum of squares produced by a mathematical function, where each square is formed by squaring the vertical distance between a data point and the regression line.
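The criterion can be sketched in a few lines of Python: for a candidate line, sum the squared vertical distances between each point and the line. The data points and the two candidate lines below are made up purely for illustration.

```python
def sum_of_squares(points, m, b):
    """Sum of squared residuals of the points from the line y = m*x + b."""
    return sum((y - (m * x + b)) ** 2 for x, y in points)

# Illustrative data that roughly follows y = 2x
points = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]

# A line close to the trend produces a smaller sum of squares
# than one far from it, which is what the criterion minimises.
close_fit = sum_of_squares(points, 2.0, 0.0)
poor_fit = sum_of_squares(points, 0.5, 1.0)
```

The line that minimises this sum over all possible slopes and intercepts is the least-squares regression line.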

Least-squares analysis begins with a set of data points plotted on a graph. Independent variables are plotted on the horizontal X-axis, while dependent variables are plotted on the vertical Y-axis. The analyst then uses the least squares formula to determine the straight line that most accurately explains the relationship between the dependent and independent variables.
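The fit itself has a closed-form solution: the slope is the covariance of x and y divided by the variance of x, and the intercept follows from the means. A minimal sketch, using made-up data that lies exactly on y = 2x so the expected answer is known:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]   # exactly y = 2x

slope, intercept = fit_line(xs, ys)   # recovers slope 2, intercept 0
```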

## Uses of Least Squares Criterion

The use of the least squares method has expanded with advances in computing power and Financial Engineering techniques. The method is widely applied in finance, Economics and Investing.

Time series analysis of return distributions, strategy and policy work, Economic forecasting and advanced option modelling all rely on the least squares method.

## Importance of Least Squares

Mathematicians use this method to arrive at a close approximation rather than solving an equation exactly. Under the common assumption of normally distributed errors, the least squares estimate coincides with the maximum likelihood estimate. The method minimises the distance between a function and the data points that the function is trying to portray. It is used in nonlinear regression modelling, where a curve is fitted to a set of data, and it is also a very popular and important method for Data Mining regression equations, where it describes the relationship between response and predictor variables. Straight-line, logarithmic, polynomial and Gaussian models are all used when fitting a function to data.
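Curve fitting with least squares works the same way for higher-degree models. A brief sketch using NumPy's `polyfit`, which minimises the sum of squared residuals for a polynomial of the chosen degree (this assumes NumPy is available; the quadratic data is made up so the expected coefficients are known):

```python
import numpy as np

# Data lying exactly on the curve y = x^2
xs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
ys = xs ** 2

# Fit a degree-2 polynomial; coefficients are returned
# highest degree first, so we expect approximately [1, 0, 0].
coeffs = np.polyfit(xs, ys, deg=2)
```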

Ordinary least squares, or linear least squares, is the simplest and most commonly used linear regression estimator for analysing observational and experimental data. It is illustrated by a straight line of best fit through a set of data points.