The problem, Sandy, is that you're not looking at these numbers on the proper scales.
You're erroneously concluding that fielders make a 1% change where pitchers make a 14% change to achieve the same results. But the fielders AREN'T making merely a 1% adjustment...it takes a SIGNIFICANT change in fielding ability to get a 1% change in BABIP. Why? Because most batted balls are routine hits or routine outs. There are in fact very few plays in a baseball season that could go either way. What's missing from your analysis is the concept of the production margin (the zero-value margin).
Here's how I would re-express your example, first in absolute units, then in marginal units.
Scenario A, better pitchers, average fielders:
800 Ks instead of 700
1900 fielded outs (of which about 500 were non-routine to at least some degree)
33 hits saved
Scenario B, average pitchers, better fielders:
700 Ks
2000 fielded outs (of which 600 were non-routine; those extra conversions are what produce the hits saved)
33 hits saved
The fielders have to make about the same skill adjustment for their hits-saved figures to match. Numbers are for illustration purposes. I actually think that non-routine plays are even more rare than this, and that rather than having to make 100 non-routine plays to save 33 hits, they have to make, say, 45 plays to save 33 hits, because non-routine plays do not carry the standard BABIP...the average team misses most of them.
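If it helps, here's the arithmetic as a back-of-the-envelope sketch. The hits-saved and non-routine figures come from the scenarios above; the balls-in-play total and the 45-play figure are my own rough assumptions, added just to make the ratios concrete:

```python
# The same 33 hits saved, measured on three different scales.
# HITS_SAVED and NON_ROUTINE come from the scenarios above;
# BALLS_IN_PLAY is an assumption (~2000 fielded outs at roughly .300 BABIP).

HITS_SAVED = 33
BALLS_IN_PLAY = 2900   # assumed total batted balls in play
NON_ROUTINE = 600      # scenario B's non-routine chances

print(f"vs. all balls in play:   {HITS_SAVED / BALLS_IN_PLAY:.1%}")  # ~1.1%
print(f"vs. non-routine chances: {HITS_SAVED / NON_ROUTINE:.1%}")    # ~5.5%

# Under the stricter guess (33 hits saved by converting ~45 genuinely
# contested plays that an average team mostly misses), the skill gap on
# the contested pool is enormous:
print(f"vs. contested plays:     {HITS_SAVED / 45:.0%}")             # ~73%
```

Same 33 hits, and the number goes from trivial to enormous depending on the denominator. That's the whole scale problem in one place.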
Seriously Sandy...think carefully about this...when you watch a baseball game, how many times do you see the batter make contact and IMMEDIATELY say "out"? How often does the announcer even raise his voice when an out is recorded?
The problem we have here is scale...it's not correct to assume that 2000 outs have significantly more value than 700 Ks. You should know this intuitively...the margin on those outs is going to be exceedingly high compared to the margin on the Ks (you stick some schmuck like A-Rod in at third base and he's still going to make something like 95% of the plays the best third baseman in the league makes, in absolute counts).

Look at the leaders in putouts in the outfield each year...the range for guys with at least 150 games played is something like 250 to 350 for the corner outfielders and 300-450 for the center fielders. That's the range from the absolute worst outfielders to the absolute best. But we KNOW the worst outfielders are not worth 2/3 of the value of the best ones...they're worth NOTHING (or even negative value) while the best ones are worth multiple wins. The reason is as described above: most plays are routine and contribute no real value to a team. Only the plays that get missed by the bad fielding teams but made by the good fielding teams have value.
Compare that to the typical ranges of Ks...750 to 1100 or more for some teams. Not only is the spread somewhat bigger, but the base value is a MUCH smaller percentage of the top value for Ks than it is for fielding outs per batted ball.
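To put rough numbers on the putout side of that, here's a small sketch using the ranges quoted above. The one assumption I'm adding: the zero-value floor sits at the bottom of the observed range, since even the worst full-time outfielder converts the routine plays.

```python
# Raw putout counts vs. value above a zero-value floor. The ranges are the
# illustrative figures quoted above; placing the floor at the bottom of the
# observed range is an assumption made to illustrate the margin point.

RANGES = {
    "corner OF putouts": (250, 350),
    "center OF putouts": (300, 450),
}

for name, (floor, best) in RANGES.items():
    mid = (floor + best) / 2  # a hypothetical middle-of-the-pack fielder
    raw_share = mid / best                            # ~85% of the best
    marginal_share = (mid - floor) / (best - floor)   # only 50% of the best
    print(f"{name}: a mid-pack fielder has {raw_share:.0%} of the best's raw "
          f"total, but only {marginal_share:.0%} of his value above the floor")
```

The raw counts make everyone look close together; the margins are where the actual wins live.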
You're getting yourself all twisted up trying to account for things on a linear surface, but you need a PYTHAGOREAN surface with a marginal baseline to make any sense of team data.
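By Pythagorean surface I mean the standard Pythagorean expectation, win% = RS^2 / (RS^2 + RA^2). Because it's nonlinear, identical run savings buy different numbers of wins depending on where a team already sits. A quick sketch...the run totals below are made up, and converting ~33 hits saved into ~25 runs is a rough hypothetical, not a derived figure:

```python
# Why the surface matters: the Pythagorean expectation is nonlinear, so the
# same runs saved are worth different win totals at different baselines.

def pythag_wpct(runs_scored: float, runs_allowed: float, exp: float = 2.0) -> float:
    """Classic Pythagorean winning percentage: RS^exp / (RS^exp + RA^exp)."""
    return runs_scored**exp / (runs_scored**exp + runs_allowed**exp)

RUNS_SAVED = 25  # hypothetical: ~33 hits saved, loosely converted to runs

for rs, ra in [(700, 800), (750, 750), (850, 700)]:
    gain = pythag_wpct(rs, ra - RUNS_SAVED) - pythag_wpct(rs, ra)
    print(f"RS={rs}, RA={ra}: {162 * gain:+.2f} wins from {RUNS_SAVED} runs saved")
```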