I've never been a big supporter of differentiating between earned and unearned runs. The theory is intuitive enough - pitchers shouldn't be penalized for defensive miscues behind them - but in practice it introduces all sorts of bias that drowns the signal in noise. In particular, it is well established that the likelihood of an error being charged is highly correlated with how easy the botched play was, while it is equally well established that run prevention by fielders is mostly about range. That is, differences in converting routine plays matter far less than differences in handling marginal opportunities.
A secondary consideration, related to the above, is that ERA is biased towards ground ball pitchers and against fly ball pitchers, because errors are seldom charged on balls that should be caught but drop in without touching a fielder. Consider, for example, extreme fly baller Marco Estrada's 0.33 career ERA-RA/9 gap (3.95 v. 4.28) against extreme ground baller Charlie Morton's 0.60 gap (4.54 v. 5.14).
For these reasons, I think it makes a lot more sense to just use Runs Allowed per Nine Innings (RA/9), ideally with a decent adjustment for team fielding quality, as Baseball-Reference does in its WAR calculation (though I'd still take unadjusted RA/9 over ERA). But largely owing to the historical weight of ERA, metrics based on overall runs allowed are almost invisible, especially ones that adjust for context. Fangraphs presents RA/9-WAR, but that's basically it: RA/9 and RA/9- are neither on its leaderboards nor available to be added (and even B-R's only adjusted pitching stat is ERA+). So, setting out to calculate RA/9- myself, I looked at the historical gap between earned and unearned runs, and was quite surprised by what I found:
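(A quick aside on the arithmetic first. Below is a minimal Python sketch of RA/9 and a league-normalized RA/9-, scaled like ERA- so that 100 is league average and lower is better. The function names and the single multiplicative park factor are my own simplifications, not Fangraphs' or Baseball-Reference's exact method.)

```python
def ra9(runs_allowed: float, innings_pitched: float) -> float:
    """Runs Allowed per Nine Innings: all runs, earned or unearned."""
    return 9 * runs_allowed / innings_pitched


def ra9_minus(pitcher_ra9: float, league_ra9: float, park_factor: float = 1.0) -> float:
    """Scaled like ERA-: 100 is league average, lower is better.
    park_factor > 1.0 means a hitter-friendly home park."""
    return 100 * (pitcher_ra9 / park_factor) / league_ra9


# Made-up example: 70 runs allowed in 180 IP, in a league scoring 4.40 runs per nine.
print(round(ra9(70, 180), 2))                   # 3.5
print(round(ra9_minus(ra9(70, 180), 4.40), 1))  # 79.5, i.e. ~20% better than average
```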
The chart covers almost 150 years of recorded baseball, with league ERA and RA/9 on the left scale and, in light blue, the percentage of runs that were earned on the right scale. In the earliest days, less than half of all runs were earned, though that share rose quickly and sharply, reaching 70% by the dawn of the 20th century. There's another sharp rise as the deadball era ends, and by 1920 both the overall offensive environment and the earned/unearned ratio had broadly stabilized.
Or did they? Let's zoom in, starting with the end of WWII and the beginning of integration:
This time I've shown only the gap between earned and unearned runs, along with the percentage earned. While it's dwarfed by what happened before, there's still been a big change: the number of unearned runs per game has steadily declined, roughly halving from 0.60 per game to about 0.30 today. Part of that is simply lower run scoring overall, but even in percentage terms we see 25 to 30 years of stability in the high-80% range until the 1970s, then a steady rise that brings us to the plateau of the last 15 years, around 92%.
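Both series are easy to reproduce from seasonal league totals. The sketch below assumes a CSV with league-wide runs, earned runs, and innings pitched per season; the file name and column names are placeholders for whatever source you pull from (Lahman, Retrosheet, a FanGraphs export, etc.).

```python
import pandas as pd

# Assumed columns: year, R (runs), ER (earned runs), IP (innings pitched),
# each summed across the league for that season.
lg = pd.read_csv("league_pitching_totals.csv")

lg["era"] = 9 * lg["ER"] / lg["IP"]          # league ERA
lg["ra9"] = 9 * lg["R"] / lg["IP"]           # league RA/9
lg["unearned_per9"] = lg["ra9"] - lg["era"]  # the gap plotted in the second chart
lg["pct_earned"] = lg["ER"] / lg["R"]        # share of runs that were earned

print(lg[["year", "unearned_per9", "pct_earned"]].tail())
```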
This doesn't much matter when comparing pitchers of the same generation, but it has huge implications when comparing pitchers of different generations. Exhibit A: Curt Schilling. Ignoring differences in the offensive environment, his 3.46 career ERA does not look particularly impressive by Hall of Fame standards; he'd slot in 10th from the bottom among the 75 inducted pitchers, and this is a common criticism from those who don't vote for him. On an RA/9 basis, however, his 3.64 mark would land almost exactly in the middle of those same inductees. His ERA-RA/9 gap of just 0.18 is 50% smaller than the next closest (Pedro, Smoltz and Eckersley). Factor in the offensive era, and he's a slam dunk.
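To make the ranking exercise concrete, here's a small sketch of the comparison. The three-row inductee table is purely illustrative stand-in data, not the actual 75 Hall of Fame pitchers; the point is only the mechanics of ranking by RA/9 instead of ERA.

```python
import pandas as pd

# Illustrative stand-in data only -- not the real Hall of Fame inductees.
inductees = pd.DataFrame({
    "pitcher": ["Inductee A", "Inductee B", "Inductee C"],
    "era": [3.10, 3.46, 3.70],
    "ra9": [3.40, 3.90, 4.20],
})
inductees["gap"] = inductees["ra9"] - inductees["era"]

schilling = {"era": 3.46, "ra9": 3.64}
gap = schilling["ra9"] - schilling["era"]  # 3.64 - 3.46 = 0.18

# Rank by RA/9 instead of ERA: how many inductees does he beat under each metric?
beats_by_era = (inductees["era"] > schilling["era"]).sum()
beats_by_ra9 = (inductees["ra9"] > schilling["ra9"]).sum()
print(f"gap={gap:.2f}, beats {beats_by_era} by ERA vs {beats_by_ra9} by RA/9")
```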
The earned/unearned distinction is less significant than ever in today's game. But in my view, it's best done away with entirely.