# Monthly Archives: March 2012

## Tanking Works

Hey everyone!  I decided to make a quick run out from my hiatus to talk about tanking.  Everyone is doing it, so I thought it was worth taking a quick look at the issue.  It’s been all over the TrueHoop …

## Hiatus

Hey everyone – sorry for abandoning the blog for a few days there.  I caught whatever’s been going around, and was spending some quality time with my couch and less time with everything else.  But, right after this I’ll get …

## RAPM – Conclusions

So that was a long series on regularized regression, the technique behind RAPM.  I covered what it is and how it tends to react to collinearity, sample size, noise, and the number of predictors.  I thought a bit of a …

## Regularization: The R in RAPM – Predictors

The last thing I wanted to look at with regularized regression is the impact of the number of predictors.  With NBA data, you would optimally get an estimate for every player who gets in a game.  However, due to collinearity …
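
One way to see the tension, as a minimal synthetic sketch (not the post's NBA data; all numbers here are made up for illustration): when the predictors become almost as numerous as the observations, plain least squares turns erratic, while the ridge penalty keeps the estimates usable.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, lam = 200, 150, 16.0          # predictors nearly as numerous as observations
beta_true = rng.normal(scale=0.5, size=p)
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(scale=2.0, size=n)

# Standard (OLS) regression: beta = (X'X)^-1 X'y -- with p close to n,
# the estimates are highly variable.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: the lambda*I penalty trades a little bias for a large
# reduction in variance (lambda here is roughly noise variance / prior variance).
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

err_ols = np.linalg.norm(beta_ols - beta_true)
err_ridge = np.linalg.norm(beta_ridge - beta_true)
print(err_ols, err_ridge)
```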

## Regularization: The R in RAPM – Noise

For my penultimate look at regularized regression, I wanted to see how noise affected things.  We know that NBA play-by-play data is very noisy; the R squared is, at best, maybe .1.  How does noise affect the benefits of ridge …
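
As a rough illustration of what an R squared around .1 means, here is a synthetic sketch (not play-by-play data; the noise level is simply chosen so the signal explains about a tenth of the variance):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 2000, 5
X = rng.normal(size=(n, p))
beta_true = np.ones(p)

signal = X @ beta_true
noise_sd = np.sqrt(9.0 * signal.var())    # noise variance 9x the signal -> R^2 near .1
y = signal + rng.normal(scale=noise_sd, size=n)

# Ridge estimate: even at R^2 ~ .1, with enough rows the coefficients are
# still recovered, just with wide error bars.
lam = 10.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

r2 = 1 - ((y - X @ beta_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(r2, beta_hat)
```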

## Regularization: The R in RAPM – Sample Size

Moving right along with my look at regularized, or ridge, regression, this time I’m going to look at the impact of having more data.  I mentioned previously that we might expect better estimates when more data are included; as players …
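
A quick synthetic check of that intuition (hypothetical data, not the series' actual simulations; errors are averaged over a few repeats so the trend is visible):

```python
import numpy as np

rng = np.random.default_rng(2)
p, lam, reps = 10, 1.0, 10
beta_true = rng.normal(size=p)

def avg_ridge_error(n):
    """Average distance from the ridge estimate to the true coefficients."""
    errs = []
    for _ in range(reps):
        X = rng.normal(size=(n, p))
        y = X @ beta_true + rng.normal(scale=2.0, size=n)
        beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
        errs.append(np.linalg.norm(beta_hat - beta_true))
    return float(np.mean(errs))

# Estimation error shrinks as the sample grows.
errors = {n: avg_ridge_error(n) for n in (50, 500, 5000)}
print(errors)
```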

## Regularization: The R in RAPM – Collinearity

As mentioned in the background, the main benefit of regularized regression over standard regression is how it deals with collinearity.  When your predictors are correlated with each other, standard regression becomes unreliable but ridge regression attempts to work around the …
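
A minimal sketch of the effect, using two synthetic, nearly identical predictors rather than real lineup data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly a copy of x1: severe collinearity
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.5, size=n)  # true coefficients: 1 and 1

# Standard (OLS) regression: beta = (X'X)^-1 X'y -- the near-singular X'X
# makes the split between the two coefficients essentially arbitrary.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: beta = (X'X + lambda*I)^-1 X'y -- the penalty stabilizes
# the inverse and shrinks the pair toward a sensible shared split.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```

Note that ridge still recovers the combined effect (the two coefficients sum to about 2); it just refuses to make a confident, unstable claim about how to divide it.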
