# Mean Square Error

Mean Square Error (MSE) is a statistical metric that measures the average of the squares of the errors or deviations. In other words, it quantifies the difference between the estimator and what is estimated. MSE is a risk function, corresponding to the expected value of the squared error loss.

It is a popular choice of loss function for regression problems, and it is also widely used in machine learning to evaluate model performance.

## Mean Square Error in practice

Given a vector of n predictions generated from a sample of n data points and a vector of the corresponding observed values, the Mean Square Error is calculated by taking the differences between predicted and observed values, squaring those differences to remove negative signs, summing them, and dividing by the number of data points.

The MSE provides a globally aggregated, single-figure measure of prediction accuracy. The smaller the Mean Squared Error, the closer we are to finding the best-fit line. A mean square error of zero indicates perfect skill, or no error. However, keep in mind that a low MSE does not by itself guarantee good model performance; the context and domain of the problem must also be considered. The model's performance also depends on the type of data, its skewness, and similar characteristics.

The formula for the MSE is:

MSE = (1/n) * Σᵢ (actualᵢ - predictionᵢ)²,  for i = 1, …, n

If predictions are totally off, MSE will be large. Conversely, if they’re pretty good, MSE will be small.
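The calculation above can be sketched in a few lines of Python. This is a minimal illustration using NumPy; the sample values are invented for demonstration only.

```python
import numpy as np

def mean_squared_error(actual, predicted):
    """Average of the squared differences between observed and predicted values."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    # Difference -> square (removes negative signs) -> mean (sum / n)
    return np.mean((actual - predicted) ** 2)

# Hypothetical observed values and model predictions
actual = [3.0, 5.0, 2.5, 7.0]
predicted = [2.5, 5.0, 4.0, 8.0]
print(mean_squared_error(actual, predicted))  # → 0.875
```

Squared differences here are 0.25, 0.0, 2.25, and 1.0; their sum (3.5) divided by n = 4 gives 0.875. Identical vectors would give an MSE of exactly zero.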
