This article was written by Dick Cramer
This article was published in the 1975 Baseball Research Journal
A Hot Stove League question was the impetus for this study. “Why have batting averages declined since the 1920-1940 era?” A popular answer was the rise of relief pitching. Formerly, the argument runs, batters during their fourth and fifth game appearances faced a tiring pitcher, whose stuff and delivery they had had a chance to become familiar with. Nowadays at the first sign of weakness by a starter, a fresh reliever is rushed into the game, often armed with a tricky specialty pitch. The batter is deprived of a soft touch in his final appearances and his average suffers.
Unlike most theories about changes in major league play, the effect of any changes in relief pitching on batting averages can be measured. Improved relief pitching should mean less run scoring in the late innings of today’s games, compared with the era when most games were complete. Also, relief pitchers might have lower ERA’s than starters. To perform these tests, data were gathered for ten representative seasons, mostly from contemporary newspapers and the Macmillan and Neft encyclopedias.
The first section of the compiled results in the table below suggests that there were three phases in the rise of the relief pitcher, a more complicated history than I had expected to find. The year 1904 marked the end of the first, or pre-relief, phase, in which starters finished more than 80% of their games (line B) and saves were recorded in about one of 20 games played (line C). Existing records for saves (5) and “points” (9) had been set by Boston outfielder Jack Manning, who pitched in 14 relief games in 1876. In the ensuing years, every other staff completed at least 75% of its starts. And in 1904 Boston and St. Louis established the all-time complete game records of 148 and 146, in the American and National Leagues, respectively.
But 1905 saw signs of the first upheaval in pitching strategy. The revolutionary managers were John McGraw and Clark Griffith. McGraw’s champions unveiled the first relief “specialist,” Claude Elliott, whose eight relief stints among ten appearances yielded a record 6 saves.
INTERACTIONS BETWEEN RELIEF PITCHING AND RUN SCORING

Season            1895   1901   1904   1911   1929   1946   1958   1968   1972   1973

I. THE USE OF RELIEF PITCHERS
A. Major league games played
                (1560)   2219   2497   2474   2460   2484   2472   3250   3718   3886
B. % of games completed by starting pitcher
                    82     86     88     58     48     42     30     28     27     27
C. % of games in which a save was recorded
                     6      3      3     12     18     18     33     37     42     39
D. % of all innings played that were pitched by a reliever
                     –      6      –     18     22      –      –     26      –   (22)
E. % of all innings played that were pitched by a “fireman” in relief
                     –      3      –      7      7      –      –     10      –   (11)
F. % of relief innings by pitchers who pitched more innings as starters
                     –     90      –     86     67      –      –      9      –      –
G. % of saves recorded by pitchers who appeared more often as starters
                    78    100     95     74     54     34     17      7      5      4

II. THE PERFORMANCE OF RELIEF PITCHERS
H. Effectiveness of all relief innings relative to starter innings (starter ERA ÷ relief ERA; values above 1.00 favor the relievers)
                     –    .76      –      –    .92      –      –    .99      –  (.99)
I. % of relief innings pitched by “firemen”
                     –     45      –     39     34      –      –     39      –   (50)
J. Effectiveness of fireman-pitched innings relative to starter innings
                     –    .96      –      –   1.20      –      –   1.17      – (1.24)
K. Effectiveness of “mopup” innings relative to starter innings
                     –    .64      –      –    .82      –      –    .90      –  (.84)

III. DISTRIBUTION OF RUN SCORING BY INNINGS
L. Games included in sample
                   258    250      –    238    250      –   1564      –    536    376
M. Average # of runs scored per inning in sample
                   .78    .59      –    .60    .58      –    .48      –    .40    .48
N. Run scoring by inning, relative to average (line M)
  1st             1.08   1.17      –    .97   1.08      –   1.12      –   1.19   1.22
  2nd              .99    .88      –    .70    .77      –    .80      –    .84    .87
  3rd              .93   1.14      –   1.01   1.15      –   1.12      –    .96   1.01
  4th              .83    .83      –    .98    .96      –    .93      –    .99    .96
  5th              .93    .87      –   1.03   1.04      –    .95      –    .94   1.12
  6th             1.00   1.01      –   1.18   1.00      –   1.09      –    .97   1.12
  7th             1.22    .92      –    .98    .95      –    .97      –   1.03    .73
  8th             1.06    .76      –   1.11   1.00      –   1.05      –   1.24   1.11
  9th              .95   1.42      –   1.04   1.06      –    .95      –    .83    .87
O. Run scoring in innings 7-9, relative to average (line M)
                  1.08   1.03      –   1.04   1.00      –    .99      –   1.03    .90

(Parenthesized figures are approximations; a dash indicates data not compiled for that season.)
Even more prophetic were Griffith’s sixth-place but respectable New York Highlanders, whose pitchers completed only 88, or 57%, of their starts. The sudden frailty of Griffith’s staff is hard to understand, as it included four pitchers who each won 200 games in their careers and who at some time had been league leaders in complete games — Chesbro, Orth, Powell, and Griffith himself. Perhaps the Old Fox was impressed with the improvement in his own pitching when the duration of his efforts dropped to four innings per game (probably a record low at the time). His overall 1.66 ERA (unofficially second in the league) may in fact have made an indelible impression, considering the subsequent pioneering experiments of his Washington clubs with Marberry and the Russells as relief specialists.
By 1911, such frequent use of relief pitchers and pinch hitters had become more general, launching the second phase of relief pitching which would last until around 1946. From 1904 to 1911, the percentage of games completed dropped 30 points, from 88 to 58; during the next 35 years, the percentage dropped only 17 more points. Similarly, the 1911 frequency of saves, which had quadrupled in the previous seven years, increased by only half again during the second phase. In fact, the proportion of all innings pitched in relief (line D) has not increased much since 1911. And “fireman” relief innings – those by pitchers whose large number of relief wins, losses, and saves indicate preferential use in clutch situations — have not changed greatly (line E).
What facts then account for the general impression that relief pitching is a phenomenon of the last 30 years only? The answer is that relief pitching has only recently become a specialty. In 1929, for example, two-thirds of all relief innings came from pitchers whose major activity, as measured by innings pitched, was starting (line F). And the save leaders, Guy Bush of the Cubs (8) and Marberry of the Senators (11), both pitched more than 200 innings as starters. Throughout the second phase, there was an increasing tendency of managers to use relief specialists in crucial situations (line G). But as late as 1953, the save leaders of half the major league clubs also started ten or more games.
The final phase of relief pitching, characterized by nearly universal use of specialists, began in 1955, when only Roy Face among club save leaders started as many as ten games. The percentage of complete games has declined only slightly from the 30% of 1955, although the proportion of games in which a save occurs has continued to climb, from 27% to about 40%. Evidently managers increasingly are deciding that a starting pitcher who probably won’t finish the game anyway might as well be removed at the first sign of trouble.
The second part of the table explores the effectiveness of relief pitching, relative to starters, based on seasons which seemed representative of each of the three phases. (Data shown for 1973 are based on the approximation of complete specialization, a particular pitcher being counted either as a starter or as a reliever.) One clear trend is that the quality of relief pitching, which in 1901 was only 76% as effective as starting pitching, has improved until composite relief ERA’s are about the same as starting ERA’s (line H). It should be added, however, that individual pitchers in all eras have tended, by 3:2 or better, to have better ERA’s in relief roles than in starting roles. Relief appearances of course are shorter, and provide less opportunity for a given pitcher to tire and lose effectiveness.
As mentioned above, relievers may be classified as “firemen” or “mopup” men; the former pick up an arbitrarily defined minimum number (15 or more) or proportion (40% or more) of wins, losses, or saves in their relief appearances. The division of total relief innings between firemen and mopup men has not varied greatly over the years (line I), but their effectiveness relative to starters has (lines J and K). Firemen of course are always more effective than mopup men, but in 1901 both types were weaker than starters. By 1929, firemen were more effective than starters — understandably, since at that time firemen usually were “starters” pitching too few innings to tire. The subsequent transition to the relief-specialist phase has not changed the fireman’s margin of superiority. The quality of mopup men has improved but remains below that of starting pitchers.
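The effectiveness ratios in lines H, J, and K are read throughout the text so that values above 1.00 favor the relievers (the 1929 firemen, at 1.20, are described as more effective than starters), which implies a ratio of starter ERA to relief ERA. A minimal sketch of that calculation, using invented season totals rather than the article’s data:

```python
def era(earned_runs, innings):
    # Earned run average: earned runs allowed per nine innings.
    return 9 * earned_runs / innings

def relative_effectiveness(starter_er, starter_ip, relief_er, relief_ip):
    # Starter ERA divided by relief ERA, as lines H, J, and K are read in
    # the text: a value above 1.00 means the relievers allowed earned runs
    # at a lower rate than the starters did.
    return era(starter_er, starter_ip) / era(relief_er, relief_ip)

# Invented totals for illustration: starters allow 600 earned runs in 1500
# innings (ERA 3.60), relievers 150 in 400 innings (ERA 3.38).
print(round(relative_effectiveness(600, 1500, 150, 400), 2))  # 1.07
```

On this reading, a staff whose relievers post a lower ERA than its starters shows a ratio above 1.00, matching the table’s fireman entries from 1929 onward.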
Finally, what about the effect of more and better relief pitching on batting averages? The last part of the table is drawn from newspaper box scores, except for the 1958 data originally gathered by SABR member Ted Lindsey.
He noted the general phenomenon, shown here, that bunching of good hitters high in the lineup and weak hitters low in the order leads to consistently higher than average scoring in the first inning and lower in the second. However, it is the comparison of run scoring in the 7th through 9th innings across the different phases of relief pitching that will answer the original question.
From the summary for these innings (line O), there would appear to be some decline in late-inning run scoring, the relative run scoring over the last three innings being 1.055, 1.023, and .976 for the three phases. However, the differences among these values are so slight that even the pre-relief vs. relief-specialist difference would occur by chance about 15% of the time if there were actually no difference between late-inning batting performances. In other words, by the usual statistical criteria, the differences among these three values are not statistically significant. Neither are they practically significant.
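These phase figures can be checked approximately from the table by averaging line O over the seasons in each phase, weighting each season by its games sampled (line L). The published values were presumably computed from the raw run totals, so a weighted average of the rounded table entries comes close but does not match exactly. A sketch:

```python
# Line O (runs in innings 7-9 relative to average) paired with line L
# (games sampled), grouped by relief-pitching phase; values copied from
# the table above.
phases = {
    "pre-relief (1895, 1901)":       [(1.08, 258), (1.03, 250)],
    "part-time relief (1911, 1929)": [(1.04, 238), (1.00, 250)],
    "specialist (1958, 1972, 1973)": [(0.99, 1564), (1.03, 536), (0.90, 376)],
}

for phase, seasons in phases.items():
    total_games = sum(games for _, games in seasons)
    weighted = sum(ratio * games for ratio, games in seasons) / total_games
    print(f"{phase}: {weighted:.3f}")
# Prints roughly 1.055, 1.020, and 0.985 -- close to the 1.055, 1.023,
# and .976 quoted in the text.
```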
Even if the apparent decline in late-inning run scoring resulted only from a dip in batting averages, not in walks or extra-base hits, the corresponding declines in league batting averages would be 1.5 points in the first transition and 2.3 points in the second transition, the one corresponding to the 1920-40 era vs. today. Since, even with the designated hitter rule, there was a 32-point drop in league batting averages between 1929 and 1973, the contribution of relief pitching to that drop is clearly trivial.