One of the hardest things about entering a new field is learning the terminology. It’s even harder when you have already spent years learning similar terms. Coming from a physics background into the world of machine learning and statistics, I found the terminology took a little getting used to. So I thought I’d clear it up a bit for anyone facing the same problems I faced.

Curve fitting
Curve fitting is one of the most common things you’ll do as an experimental physicist, or in pretty much any hard science. You gather a set of data, visualize it, create a fit, and build a model around that fit so you can interpolate. Most of the time, if not every time, you know exactly what parameters are in the dataset, as they correspond to some physical event. Building a fit helps you extract a mathematical equation that dictates how the event will act in the future, given the same parameters. Since you know the parameters (and, in the case of an experiment, how it was set up), you can treat your errors and uncertainties more carefully. These can include human error, instrument error, and so on.
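The workflow above can be sketched in a few lines with SciPy. This is a minimal illustration, not a prescription: the exponential-decay model, the parameter values, and the noise level are all made up for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, rate):
    """Hypothetical physical model: exponential decay (e.g. a discharging capacitor)."""
    return amplitude * np.exp(-rate * t)

# Synthetic "measurements": true amplitude 5.0, true rate 1.3, plus small noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 50)
y = decay(t, 5.0, 1.3) + rng.normal(0, 0.05, t.size)

# Fit the known model to the data; pcov encodes the parameter uncertainties,
# which is where the careful error treatment mentioned above comes in.
params, pcov = curve_fit(decay, t, y)
amplitude, rate = params
errors = np.sqrt(np.diag(pcov))
```

Because the model is known in advance, the fit returns physically meaningful parameters (here `amplitude` and `rate`) along with their uncertainties, which you can then combine with instrument and human error estimates.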

Going the other way around, if you already have a theorized model, you can use curve fitting on experimental data to extract an equation and verify a theory that was derived without any data. This is where theoretical and experimental scientists play together.

Regression
Regression is a far more loaded term and has many connections to machine learning. Admittedly, curve fitting also sounds simpler. It’s not. Regression analysis is most commonly used in forecasting and building predictions. It deals with the relationship between the independent variable and the dependent variables, and with how the dependent variables change when the independent variable is changed. More often than with curve fitting, you have to keep in mind that correlation does not always mean causation in regression analysis. On the more complex side, regression analysis can deal with “messier” and unstructured data (as in machine learning), but we won’t go into that, as it’s beyond the scope of this text.
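For contrast, here is the same mechanical procedure used in the regression spirit: fit a simple linear relationship and use it to forecast an unseen point. The data values are invented for illustration, and a straight line is assumed purely for simplicity.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # independent variable
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # dependent variable

# Ordinary least squares: fit y ≈ slope * x + intercept.
slope, intercept = np.polyfit(x, y, deg=1)

# The regression goal: predict the dependent variable at a new,
# unobserved value of the independent variable.
prediction = slope * 6.0 + intercept
```

Note that nothing here says *why* y tracks x; the fitted line captures the correlation and lets you forecast, but by itself it establishes no causal mechanism.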

If it’s not clear from the text above, the difference between curve fitting and regression analysis is mostly in how they are used. The distinction is very small, if there is one at all. The methods are the same (gather data, find the best fit, build a model), but the goals are different, and the little details, such as how you calculate error, might differ slightly.

I hope this cleared it up a little bit for you.