The attached graph examines the standard deviation of the batting averages for all players in each season from 1900 through 2009. The number of games along the bottom is the cutoff below which players were not counted as batters in that year (a player also had to accrue at least 1.6667 at bats per game). I did not correct for stints. (Someone please do that for me, or wait till I learn PHP and MySQL and R.)
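Here is a rough sketch of the calculation in Python with pandas, pending that PHP/MySQL/R education. It assumes a Lahman-style Batting table with playerID, yearID, stint, G, AB, and H columns; the file name and my exact reading of the cutoff (a minimum-games floor plus the 1.6667 AB-per-game requirement) are assumptions. It also folds in the stint correction by summing each player's stints into one season line:

```python
import pandas as pd

# Assumed: a Lahman-style Batting table with playerID, yearID,
# stint, G, AB, and H columns. "Batting.csv" is a hypothetical path.
batting = pd.read_csv("Batting.csv")

def batting_avg_sd(batting, min_games):
    """Per-season SD of batting average under a games cutoff."""
    # Correct for stints: collapse each player's season across stints.
    season = (batting.groupby(["playerID", "yearID"], as_index=False)
                     [["G", "AB", "H"]].sum())
    # Keep players at or above the games cutoff who also averaged
    # at least 1.6667 at bats per game (my reading of the cutoff).
    qual = season[(season["G"] >= min_games) &
                  (season["AB"] >= 1.6667 * season["G"])].copy()
    qual["avg"] = qual["H"] / qual["AB"]
    return qual.groupby("yearID")["avg"].std()

sd_by_year = batting_avg_sd(batting, min_games=50)
```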
Anyway, as the cutoff minimum increases, the variability in the remaining pool changes. When the cutoff nears 50 games, the variability seems to settle down. A one-hundred-game cutoff needlessly depreciates the accomplishments of high-achieving batters by comparing them to far fewer batters.
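To see where the variability settles, one could sweep the cutoff and plot the result, reusing the batting_avg_sd sketch above (the cutoff grid and the choice to average the per-season SDs into one summary number are my own assumptions):

```python
import matplotlib.pyplot as plt

cutoffs = list(range(10, 110, 10))
# For each cutoff, average the per-season SDs across 1900-2009.
mean_sd = [batting_avg_sd(batting, g).mean() for g in cutoffs]

plt.plot(cutoffs, mean_sd, marker="o")
plt.xlabel("minimum games cutoff")
plt.ylabel("mean SD of batting average, 1900-2009")
plt.show()
```

If the curve flattens out around 50 games, that would back up the eyeball read of the graph.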
Does this suggest that a 50-to-60-game spring training season would aid fantasy leaguers? Cut off some of those meaningless games near the 162 mark and give more walk-on incentive!
CACTUS MITCH is for LONGER SPRING TRAINING! Is that because he lives in Arizona?
Establishing a meaningful low-end threshold for statistical comparisons seems important to me. The 100 at bats that Gould used seems too high, especially when making historical comparisons between modern players and historic players who had far fewer games in their seasons.
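One hypothetical fix would be to scale the at-bat threshold to the schedule length of the era rather than using a flat 100 (the function name and the 162-game baseline here are my own choices, not Gould's):

```python
def scaled_ab_cutoff(season_games, base_ab=100, base_games=162):
    """Scale a flat AB cutoff to the schedule length of an era."""
    return base_ab * season_games / base_games

# A 140-game schedule, like the 1900 NL's, works out to roughly 86 AB.
print(scaled_ab_cutoff(140))  # ~86.4
```

That way a 1900s batter and a modern batter would both need roughly the same share of their team's schedule to qualify.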