Copyright © 2023 AnswerCult

So, regular (ordinary least-squares) regression minimizes the sum of squared residuals using a small set of parameters that define a linear model of your outcome. This means that outliers have a huge impact on your model. It also means that the model will fail to fit non-linear relationships between your regressors and your outcome (e.g. you can’t draw a straight line that fits a quadratic trend).
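To make that concrete, here's a minimal sketch (my own illustration, assuming NumPy) of a straight line fit to a purely quadratic trend; the single global slope and intercept can't bend, so the residuals stay large and systematically patterned:

```python
import numpy as np

# A straight line fit to a quadratic trend: one global slope/intercept
# cannot bend with the curve, so residuals stay large and structured.
x = np.linspace(-2, 2, 100)
y = x**2
slope, intercept = np.polyfit(x, y, 1)   # ordinary least-squares line
residuals = y - (slope * x + intercept)
print(np.abs(residuals).max())           # the line misses the curve badly
```

Plot the residuals against `x` and you'll see a clear U-shape instead of random scatter, which is the telltale sign that a line is the wrong model.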

There are, of course, *lots* of ways to address these problems. LOESS regression is one of many of them.

LOESS is a form of locally weighted regression. These methods are non-parametric. This means that, rather than solving for a few unknowns that define a line, parabola, or whatever form your model takes, you simply keep your training data and use an algorithm to make predictions from it. Basically, given any data point for which you would like a prediction, you construct a model fit on the training observations that are “close” to it in your regressor dimensions, giving more weight to observations that are closer. What constitutes “close” is determined by the researcher.
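Here's a minimal sketch of that idea for a single query point (my own example, assuming NumPy; the function names and the tricube kernel choice are illustrative, though tricube is the weight function LOESS traditionally uses):

```python
import numpy as np

def tricube(d):
    """Tricube kernel: full weight at distance 0, zero weight at distance >= 1."""
    d = np.clip(np.abs(d), 0.0, 1.0)
    return (1.0 - d**3) ** 3

def local_linear_predict(x_train, y_train, x0, bandwidth):
    """Weighted least-squares line fit around x0; returns the fitted value at x0."""
    # Distance-based weights: observations closer to x0 count more.
    w = tricube((x_train - x0) / bandwidth)
    # Solve weighted least squares by scaling the design matrix and y by sqrt(w).
    X = np.column_stack([np.ones_like(x_train), x_train])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y_train * sw, rcond=None)
    return beta[0] + beta[1] * x0

# Noisy quadratic: a single global line fits poorly, but a local
# line around any given x0 tracks the curve just fine.
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 200)
y = x**2 + rng.normal(scale=0.1, size=x.size)
print(local_linear_predict(x, y, x0=1.0, bandwidth=0.5))  # close to the true value 1.0
```

Note that nothing is "learned" up front: the training data itself is the model, and a fresh little regression is solved for every point you want to predict.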

The model you fit using that subset of points can be pretty much anything in a locally weighted regression. In LOESS in particular, though, a low-degree polynomial model (typically degree 1 or 2) is used. I think the animation at the top of the wiki page [here](https://en.wikipedia.org/wiki/Savitzky%E2%80%93Golay_filter) does a good job of demonstrating what it looks like to fit polynomials to specific “windows” of data in a series. The process for LOESS amounts to defining the window around the point you want to predict, fitting that polynomial, and using it to make your prediction.
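Putting the pieces together, a bare-bones LOESS smoother might look like this (my own sketch, assuming NumPy; the `span` parameter is the usual "fraction of nearest points per window" knob, and the window/weight details follow the standard tricube recipe rather than any one library's implementation):

```python
import numpy as np

def loess(x, y, span=0.3, degree=2):
    """LOESS smoother: at each x[i], fit a weighted polynomial of the given
    degree to the nearest `span` fraction of the data, then evaluate at x[i]."""
    n = len(x)
    k = max(degree + 1, int(np.ceil(span * n)))  # points per local window
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]            # the k nearest neighbours form the window
        h = d[idx].max()                   # local bandwidth: distance to the farthest one
        w = (1.0 - (d[idx] / h) ** 3) ** 3  # tricube weights, heavier near x[i]
        # np.polyfit minimizes sum((w_j * residual_j)**2), so pass sqrt of the weights
        coeffs = np.polyfit(x[idx], y[idx], degree, w=np.sqrt(w))
        fitted[i] = np.polyval(coeffs, x[i])
    return fitted

# Smooth a noisy sine wave: the local polynomials track the curve
# even though no single global line or parabola could.
rng = np.random.default_rng(1)
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
smooth = loess(x, y)
```

Production implementations (e.g. `statsmodels.nonparametric.lowess` in Python, or `loess()` in R) add robustness iterations that down-weight outliers, which is what makes LOESS resistant to the outlier problem mentioned at the top.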