A couple days ago, SuckaMD took a look at how various players who got $100-million contracts had performed over the course of their contracts, and came to the conclusion that they weren't necessarily as poor value as conventionally thought. If you haven't already read it, it's definitely worth reading before continuing here.
The main issue I had was that high inflation across baseball over the last decade (and even before that) makes it difficult to compare value across a long-term contract in aggregate, which to me meant it's hard to draw an inference by totalling up WAR and comparing it to the total cost. I was initially just going to put a few thoughts in the comments, but as I looked deeper and crunched some numbers, more information and conclusions came out that I thought merited a separate post. My approach was to calculate the value produced compared to the salary received in each year, calculate the surplus value, and then compare it to the total salary on a percentage basis to see whether players were overpaid or not. It looks like this for A-Rod's first contract:
Before I get to the data, a couple quick notes on my method:
- I loosened the criteria from $100M+ to $70M+ contracts. This brings in several elite players who signed contracts pre-2001 and whose deals, were they signed today, would easily be $100M+. It gives us a larger sample of 17 players, yet all of them were pretty much elite players in their primes when they signed.
- I used an average of fWAR and rWAR, to smooth out some differences in defensive measurements and such.
- I didn't include the value of any team options (other than buyouts, which have to be paid), since the team is not obligated to pick them up. Conversely, any player options are included, since the team is committed to paying them. This really only figured into the first A-Rod contract.
- For the $/WAR value, I used Fangraphs' numbers, and for years prior to 2002 I extrapolated using the average growth from 2002-10.
- Most of the contract info comes from Cot's Contracts (except for where they didn't have some retired players), and of course WAR numbers from Fangraphs and Baseball-Reference.
- For contracts that are not complete, I've only included the production and salaries paid to date. I also didn't include any contracts signed after 2010 (Werth, Crawford, Beltre), since there's only one year of data, though it does not change the result materially.
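To make the per-year surplus approach concrete, here's a minimal sketch in Python. The $/WAR rates and the player-seasons below are hypothetical placeholders, not the actual Fangraphs figures or any real contract:

```python
# Sketch of the per-year surplus calculation described in the method notes.
# The $/WAR rates and seasons here are made-up numbers for illustration.

DOLLARS_PER_WAR = {2001: 3.0, 2002: 2.6, 2003: 2.8}  # $M per WAR (hypothetical)

def surplus_summary(seasons):
    """seasons: list of (year, avg_WAR, salary_in_$M) tuples."""
    total_salary = sum(salary for _, _, salary in seasons)
    # Value each season's production at that season's $/WAR rate
    total_value = sum(war * DOLLARS_PER_WAR[year] for year, war, _ in seasons)
    surplus = total_value - total_salary
    return total_value, total_salary, surplus / total_salary  # Gain/Loss %

value, salary, gain_pct = surplus_summary(
    [(2001, 8.0, 22.0), (2002, 9.0, 22.0), (2003, 8.5, 22.0)]
)
```

The key point is that each season's WAR is priced at that season's market rate before anything is totalled, so inflation over the life of the deal doesn't distort the comparison.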
On average, a free agent position player on a big contract only produces about 78% of the salary he's paid, and of the 17 players in the sample, only 4 produced positive value (24%). But that's not the end of the story, since there's bias here. We know that teams usually get more value on the front end of the contract, and take a hit on the back end. But almost one-third of the players in the sample still have years left, which means the latter parts of some contracts are missing. To adjust for this, I projected player performance for the remaining years of the contracts, using a 5/4/3 weighting of 2011, 2010, and 2009 WAR, with a -0.5 WAR/year aging curve and 7% salary inflation. It's definitely not perfect, but serves as a quick and simple approximation. We get the following:
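As a sketch, the projection for the remaining contract years could look like the function below. The function name and the sample inputs are illustrative assumptions, not the exact spreadsheet math used here:

```python
# Sketch of the projection for incomplete contracts: a 5/4/3 weighting
# of the last three seasons' WAR (most recent weighted heaviest), a
# -0.5 WAR/year aging adjustment, and 7% annual $/WAR inflation.
# Inputs are hypothetical.

def project_remaining(war_last3, dollars_per_war_now, remaining_years):
    """war_last3: (WAR_2011, WAR_2010, WAR_2009), weighted 5/4/3."""
    w2011, w2010, w2009 = war_last3
    baseline = (5 * w2011 + 4 * w2010 + 3 * w2009) / 12
    projections = []
    rate = dollars_per_war_now
    for year in range(1, remaining_years + 1):
        rate *= 1.07                  # 7% $/WAR inflation per year
        war = baseline - 0.5 * year   # -0.5 WAR/year aging curve
        projections.append((war, war * rate))  # (projected WAR, $M value)
    return projections

# e.g. a 5/4/3 WAR line of (5.0, 4.0, 3.0) at $4.5M/WAR, two years left
future = project_remaining((5.0, 4.0, 3.0), 4.5, 2)
```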
The intuition is correct: with the projections included, the aggregate ratio of value produced to salary received is slightly lower at 75%, though broadly in line with the above results.
Trends in the Results
First, I wanted to look at whether the larger the contract, the worse the ratio of production-to-salary, so I plotted the size of the contract against the Gain/Loss %:
The relationship appears to be very weak, and is modestly negative. However, on review, there is one clear outlier, the first Alex Rodriguez contract. Though widely considered a huge overpayment at the time it was signed, time has shown that it was good value in absolute terms, and particularly so relative to other large contracts. If we remove that contract, we get a much stronger relationship:
We can see from this relationship that larger contracts carry a double whammy: not only is the dollar loss proportionately larger, but you also lose a higher percentage of the salary. The size of the contract only explains 20% of the total variation, but that is reasonably strong for a single variable. If anything, I expected a positive relationship to reflect risk aversion, since teams should be more wary the larger the deal (a $75M contract isn't as crippling to a team budget as a $200M contract).
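A bare-bones version of the fit behind these two charts: regress Gain/Loss % on contract size and compare R-squared with and without the outlier. The data points are invented for illustration; the article's actual contract dataset is not reproduced here.

```python
# Simple one-variable least squares, then refit without the outlier.
# All (size, gain) pairs below are hypothetical.

def ols(xs, ys):
    """Returns (slope, intercept, R^2) for y regressed on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

# Contract size ($M) vs Gain/Loss %; the last point plays the role of a
# big contract that returned strong positive value, like the first A-Rod deal.
sizes = [75, 90, 120, 140, 180, 252]
gains = [0.05, -0.10, -0.15, -0.25, -0.30, 0.20]

slope_all, _, r2_all = ols(sizes, gains)              # outlier included
slope_trim, _, r2_trim = ols(sizes[:-1], gains[:-1])  # outlier removed
```

With this toy data, removing the one outlier turns a weak fit into a clearly negative, much stronger one, which is the same mechanism at work in the charts.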
To me, this leads to three possible conclusions, which I shall analyze later in the article with some additional data:
1) Teams generally overbid for premium free agents and end up overpaying, and are victims of the winner's curse;
2) There are significant non-baseball revenues generated by premium free agents that aren't being captured in this analysis;
3) The value of premium free agents ($/WAR) isn't linear, and the balancing factors that result in linear $/WAR for most free agents are different for premium free agents.
The second trend I wanted to investigate is whether teams have gotten smarter over time at getting value from big contracts. I plotted the Gain/Loss % against the year the contract was given out:
What we see is a modest linear trend showing teams getting better value over time; however, the regression only explains about 5% of the variation. If we use a polynomial regression (of degree two) instead, we see that teams got better value over time, peaking around 2005, and since then have been getting worse value. The peak corresponds with a period when owners were claiming to be losing money, so it's possible that the market was more disciplined for a time, or that there were fewer teams that could afford these contracts. Also, the dollar values of the contracts given out were generally smaller, so it's possibly just the effect discussed above. I tried weighting each data point by the contract value and running the regression, but I feared that violated some OLS assumptions, so I left it out. For what it's worth, the weighted results indicated no relationship with the year whatsoever.
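The degree-two fit can be sketched as follows. The years and Gain/Loss values are invented to show the shape of the fit; the real peak year depends entirely on the actual data.

```python
# Sketch of a degree-two polynomial fit of Gain/Loss % against signing
# year; the vertex of the fitted parabola is the "peak value" year.
# Data points are hypothetical.
import numpy as np

years = np.array([1998, 2000, 2001, 2003, 2005, 2006, 2008, 2010])
gains = np.array([-0.40, -0.30, -0.25, -0.10, 0.05, -0.05, -0.20, -0.35])

# Fit gain = a*year^2 + b*year + c
a, b, c = np.polyfit(years, gains, deg=2)
peak_year = -b / (2 * a)  # vertex of the parabola (a maximum when a < 0)
```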
What About Non-Free Agents?
Finally, I wanted to look at another class of players who have received similarly large contracts, but not as free agents, in order to see how the value produced to salary ratio varies. This sample is made up of players who received contract extensions covering free agent years, but who didn't actually test free agency. In theory, these players have significant leverage (the ability to test free agency imminently), but can't negotiate with all teams to provoke an auction. In this group, I only include players who signed contracts buying out free agent years only, or where the vast majority of the years were free agent years (only one arb year, two in the case of Miguel Cabrera). I then excluded the salary and production from the years when they would not have been free agent eligible.
We see that the results here are much better for teams compared to buying similar talent in free agency. This would seem to suggest that even among elite players, the value is roughly close to the linear market rate. If there were value beyond the baseball production, we might expect to see it reflected here, as most of these players would be considered "franchise" players, yet really we don't. Therefore, it would seem to me that premium free agents certainly do get overpaid, at least to some extent. It's also possible that a selection bias is at work, and that a player's original team, which presumably has the most information about the player, makes very smart decisions about whom to extend and whom to let test free agency. A couple notes on inclusions and exclusions:
- I struggled with whether to include Joe Mauer, since he is only one year into his extension and it was one of his worst years, but ultimately did since it was a huge extension that is similar to others included and belongs in the sample.
- Carlos Delgado is included because while the total contract value fell slightly under $70M, it was the highest annual value at the time he signed the deal, which is significant.
- I also excluded Ken Griffey Jr. and Jeff Bagwell. Griffey's extension was far below market value since he wanted to play at home in Cincinnati (Seattle offered him 8 years/$148M; he took $116.5M over 9 years), which would skew the data. Bagwell retired before the end of his contract, and it wasn't clear to me exactly how much money he received.
Again, I looked at the same sample, projecting the rest of the contracts forward in the same manner as was done with the free agent sample, and there's a similar deterioration, though that is largely attributable to Mauer and Wells (otherwise it's almost breakeven overall).
On average, it appears that big contracts given out to free agents result in a loss of 20-25% of the contract's value. There is minimal evidence that teams are getting better value over time, but the larger the contract, the less production a team can expect to recoup relative to the salary commitment. Compared to extensions covering free agent years, the loss on free agent contracts appears to be greater, though the extension sample has smaller contracts on average.
I'm looking at doing pitchers next, depending on the interest level. I've done my best not to miss any contracts, but if I've omitted one that fits the criteria, please leave a comment and I'll update the post to include it.