Back in the spring, when I first tried to replicate Rob Vollman's instructions on how to make projections, I missed a vital step. I forgot to account for and remove the random variation in my weighted Pts Total before factoring in the age regression. This didn't happen because I didn't know that I had to; I just didn't know HOW to do it. Not until I purchased Stat Shot by Rob did I get a better understanding of how to do it.
So what exactly does random variation mean, and how do you remove it?
Not all players in the NHL are consistent from year to year. Players who have large swings in their Pts Totals from season to season are more likely to have been influenced by random events. Players who have consistent Pts Totals have less random variation.
To remove random variation, first we determine how correlated the weighted average (or base Pts Total) is to a player's upcoming season's Pts Total. When the weighted average strongly matches the actual results, there is high correlation. Conversely, when the average is all over the place, there is low correlation and more random variation.
If you created projections using our tools, we would have tested how correlated your weighted average was by using players' Pts Totals from the 2014/15 to 2016/17 seasons. We then determined the correlation coefficient with the 2017/18 season (using the 503 players who played all four of those seasons).
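In plain terms, that test amounts to computing a Pearson correlation coefficient between each player's weighted average and his actual next-season Pts Total. Here is a minimal Python sketch of that calculation; the player numbers below are made up for illustration, not the real 503-player dataset:

```python
from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length lists:
    # covariance divided by the product of the standard deviations.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical data: weighted averages from 2014/15-2016/17
# vs. actual 2017/18 Pts Totals for five imaginary players.
weighted_avg = [72, 55, 38, 61, 44]
actual_pts = [68, 58, 35, 66, 40]
r = pearson(weighted_avg, actual_pts)
print(round(r, 3))
```

A value of `r` near 1 means the weighted averages track the actual results closely (little random variation); a value near 0 means the averages are all over the place.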
Once the correlation coefficient is determined, the player's weighted Pts Total needs to be regressed toward the league average. This is achieved by adding the weighted average (multiplied by the square root of the correlation coefficient) to the league's average Pts Total (multiplied by one minus the square root of the correlation coefficient) to get a new base Pts Total without all that random variation.
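The regression step described above boils down to a single blend: new base = sqrt(r) × weighted average + (1 − sqrt(r)) × league average. A minimal Python sketch, where the function name and the example numbers are mine for illustration, not from the book:

```python
from math import sqrt

def regress_to_mean(weighted_pts, league_avg_pts, r):
    # Blend a player's weighted Pts Total with the league average.
    # The weight on the player's own history is sqrt(r), where r is
    # the correlation coefficient between past weighted averages and
    # the following season's actual Pts Totals.
    w = sqrt(r)
    return w * weighted_pts + (1 - w) * league_avg_pts

# Hypothetical numbers: a 70-point weighted average, a 40-point
# league average, and a correlation coefficient of 0.64.
# sqrt(0.64) = 0.8, so the result is 0.8*70 + 0.2*40 = 64.0.
print(regress_to_mean(70, 40, 0.64))
```

Note how a weaker correlation pulls the projection harder toward the league average: with `r = 0.25` the same player would be projected at 0.5×70 + 0.5×40 = 55 points.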
If you want to know more, I suggest you get Stat Shot by Rob Vollman. He eloquently outlines the process on page 31 of his book, in a way I can only envy.
Here is that eloquent calculation done through some not-so-eloquent SQL code.