In some heavily parameterized models, one may benefit from shifting some of the parameters towards a common target. We consider L2 shrinkage towards an equal parameter value, which balances between unrestricted estimation (i.e. allowing full heterogeneity) and estimation under an equality restriction (i.e. imposing full homogeneity).
The penalty parameter of such a ridge regression estimator is tuned by leave-one-out cross-validation. The reduction in predictive mean squared error tends to increase with the dimensionality of the parameter set.
We illustrate the benefit of such shrinkage with a few stylized examples. We also work out an example of a heterogeneous panel model, including estimation on real data.
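The estimator described above can be sketched numerically. The following is a minimal illustration on simulated data (all names and the data-generating process are hypothetical, not taken from the paper): the penalty matrix M = I − (1/p)11' penalizes deviations of the coefficients from their own mean, so λ = 0 reproduces OLS (full heterogeneity) while λ → ∞ imposes a common value (full homogeneity). Leave-one-out cross-validation is computed exactly via the standard hat-matrix shortcut e_i/(1 − h_ii), which holds for any linear ridge-type smoother.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: n observations, p mildly heterogeneous
# slopes scattered around a common value of 1.
n, p = 60, 8
X = rng.standard_normal((n, p))
beta_true = 1.0 + 0.3 * rng.standard_normal(p)
y = X @ beta_true + rng.standard_normal(n)

# Centering penalty matrix: M beta = beta - mean(beta) * 1, so the
# penalty lam * beta' M beta shrinks the coefficients towards equality.
M = np.eye(p) - np.ones((p, p)) / p

def ridge_to_mean(lam):
    """Shrinkage estimator beta(lam) = (X'X + lam * M)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * M, X.T @ y)

def loo_cv(lam):
    """Exact leave-one-out MSE via the shortcut e_i / (1 - h_ii)."""
    A = np.linalg.solve(X.T @ X + lam * M, X.T)  # (X'X + lam M)^{-1} X'
    H = X @ A                                    # hat matrix of the smoother
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

# Tune lambda on a log grid by minimizing the LOO criterion.
grid = np.logspace(-2, 4, 50)
scores = [loo_cv(lam) for lam in grid]
lam_star = grid[int(np.argmin(scores))]
beta_hat = ridge_to_mean(lam_star)
print("selected lambda:", lam_star)
print("shrunken coefficients:", beta_hat)
```

As λ grows, `ridge_to_mean` interpolates between the unrestricted fit and the equality-restricted fit, which is the trade-off the abstract describes.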