From SABR member David Kagan at The Hardball Times on February 5, 2020:
In an earlier article here at The Hardball Times,
I described the common procedure for measuring the bounciness of a baseball. In case you don’t have time to peruse the article itself, here’s the quote:
The standard methodology is to fire a 60-mph ball at a wall of northern white ash (a common wood used for bats) and measure the speed of the rebounding ball. The rules require the rebound to be 32.76 ± 1.92 mph.
The standard way to report this value is as a fraction of the initial speed. This fraction is called the “coefficient of restitution,” or COR. The COR, then, must be 0.546 ± 0.032.
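As a quick check of the arithmetic in that quote, here is a minimal sketch of the calculation (the function name and variable names are illustrative, not from the article):

```python
# Illustrative sketch: COR is the rebound speed expressed as a
# fraction of the initial speed, per the quoted definition.

def coefficient_of_restitution(rebound_mph: float, initial_mph: float) -> float:
    """Rebound speed divided by initial speed."""
    return rebound_mph / initial_mph

initial = 60.0       # mph, the test speed in the standard methodology
rebound_mid = 32.76  # mph, midpoint of the required rebound range
rebound_tol = 1.92   # mph, allowed deviation in rebound speed

cor_mid = coefficient_of_restitution(rebound_mid, initial)
cor_tol = rebound_tol / initial

print(f"COR = {cor_mid:.3f} ± {cor_tol:.3f}")  # prints "COR = 0.546 ± 0.032"
```

Dividing both the rebound speed and its tolerance by the 60-mph test speed reproduces the quoted COR of 0.546 ± 0.032.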
A reader comment on this article brought up a very salient point:
Why would they measure COR at a relative velocity of only 60 mph? Why not something closer to the practical relative velocity of 200 mph? Maybe the average COR would be the same, but maybe not.
One can quibble with the 200 mph figure: under game conditions, the pitch is likely between 80 and 100 mph, and the relevant portion of the bat is moving at about 70 mph, so a relative speed of about 160 mph is more reasonable. Nonetheless, why measure COR at 60 mph?
Since 1998, the American Society for Testing and Materials (ASTM) has thoroughly described the method for measuring the COR of baseballs, beginning with F1887-98 – Standard Test Method for Measuring the Coefficient of Restitution (COR) of Baseballs and Softballs. This method uses baseballs moving at 60 mph. Occam’s Razor suggests the speed was chosen because that was the highest speed that would produce repeatable values with the technology available back then.
Read the full article here: https://tht.fangraphs.com/the-physics-of-cor-and-other-measures-of-bounciness/
Originally published: February 7, 2020. Last Updated: February 7, 2020.