Speaking of batting .400, everyone knows that Ted Williams was the last to do it--in 1941, when he batted .406. On the last day of the season, with his team, the Boston Red Sox, scheduled to play a doubleheader, Williams had 179 hits in 448 at-bats, so his average stood at an even .400 (actually, to five places right of the decimal point, 179/448 equals .39955, but that would have gone in the books as .400). I've read--I'd be interested in hearing from someone who knows whether it's true or just part of the Williams legend--that the Boston manager offered to hold Williams out of the doubleheader that day in order to preserve his .400 average. Williams is supposed to have said something along the lines of "Nuts to that" before taking to the field, where he cracked out six hits in the two games, raising his average to .406.
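For the curious, the arithmetic checks out in a few lines of Python. (The 8 at-bats for the doubleheader is my addition, taken from the commonly reported 6-for-8 line; the post itself only mentions the six hits.)

```python
# Batting averages are conventionally rounded to three decimal places.
before = 179 / 448
print(f"{before:.5f}")  # 0.39955 -- technically just under .400
print(f"{before:.3f}")  # 0.400   -- what goes in the books

# Six hits in the doubleheader, over the commonly reported 8 at-bats:
after = (179 + 6) / (448 + 8)
print(f"{after:.3f}")   # 0.406
```

So the .400 on the morning of that last day was itself a product of rounding; the .406 at day's end was not.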
Chipper Jones, should he be standing at .400 before the last game of the season, may be more sorely tempted than Williams was to sit it out. In the history of baseball up to 1941, a .400 average had been reached in quite a few seasons by many different players and had not, therefore, acquired the nearly mystical status that all the dry years since 1941 have endowed it with. Which raises the question: why hasn't anyone batted .400 for almost 70 years?
The best answer to this question has been offered by a paleontologist. In "Losing the Edge," one of the essays collected in The Flamingo's Smile, Stephen Jay Gould advances the idea that a .400 batting average may be regarded as an outlier in a distribution of all batting averages. If in the course of time baseball strategies are perfected and the play becomes more regular, then variation within the system will decline, and outliers will be drawn toward the mean. It is not that Rogers Hornsby or Ty Cobb towered over the best batters of my lifetime. It's that they had the advantage of playing when the game was still young, before every pitch and hit was tracked, and before fielders and pitchers--and batters too--adopted optimal strategies backed by a history of experience and research. In 1897, when Wee Willie Keeler batted .424, fielders did not position themselves according to the meticulously researched tendencies of different batters. The 1897 outlier would, under the conditions of modern competition, not stand out so far from the mean.
The scientist in Gould must of course test his hypothesis. It turns out that the amount of variation within the system of batting averages has indeed declined considerably. For example, in the 1890s, the decade of Wee Willie Keeler's .424, the difference between the average of the five highest averages and the league average was 91 points. This figure became, in the next decade, 80, then 83, 81, 70, 69, 67, 70, and, in the 1970s, 68. For the difference between the lowest averages (of those who played regularly) and the league average, the numbers are smaller but exhibit an even greater percentage decline.
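The shrinkage is easy to see if you line the numbers up. Here is a quick sketch using the gaps quoted above; the decade labels after the 1890s are my inference from the sequence, since the paragraph only says "in the next decade" and names the 1970s at the end.

```python
# Gap (in points of batting average) between the mean of the five highest
# averages and the league average, per decade, as quoted from Gould's essay.
gap_top5 = {
    "1890s": 91, "1900s": 80, "1910s": 83, "1920s": 81,
    "1930s": 70, "1940s": 69, "1950s": 67, "1960s": 70, "1970s": 68,
}

first, last = gap_top5["1890s"], gap_top5["1970s"]
decline_pct = 100 * (first - last) / first
print(f"Decline from the 1890s to the 1970s: {decline_pct:.0f}%")  # 25%
```

A quarter of the elite batters' edge over the league, gone--despite the bumps along the way--which is just what Gould's shrinking-variation story predicts.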
Gould's is a good theory and widely applicable. We are ourselves, as collections of atoms, extreme outliers that over the course of just 102 years settle back toward the dull mean.