In February, as part of negotiations with the MLBPA on a new collective agreement, MLB floated the concept of a further reduction in the minor leagues, after having restructured them to eliminate short-season teams over the past couple of years. It didn’t go anywhere and was seemingly quickly dropped, but MLB’s goal was clear: to further rein in the cost of player development.
A number of observers point out that no other sport has such an extensive minor league system, and in that respect it’s a vestigial remnant of a bygone era. Setting aside whether baseball is comparable to other sports where college functions as a de facto (and unpaid) minor league development system, it occurred to me that even if such a reduction were possible, MLB might want to be careful what it wishes for, given the potential long-term implications.
After all, one of the principal reasons for teams having six (and often effectively seven) years of control over players was the argument that teams make huge investments in player development and have to be able to recoup those investments. That lengthy control underpins teams’ ability to suppress player earnings and drive profitability and franchise valuations.
But drastically reduce those costs, and the basis for lengthy control is called into question. To be clear, it would still take a huge fight by the players to achieve change, but it might make them more willing to try. It struck me that there’s an interesting parallel here to 1994, when it was the owners who were willing to go to the mat to achieve a new economic system, ultimately at the cost of the playoffs. They lost that fight, and the great irony is that decades later it’s actually to their benefit that they did.
It’s important to set the background for what happened in 1994. After free agency and salary arbitration were instituted, salaries grew rapidly, and predictably a lot of owners claimed financial ruin and insisted on mechanisms to restrain them. But they were a house divided, and they couldn’t achieve this at the bargaining table.
Instead, they recruited Peter Ueberroth as commissioner/financial disciplinarian, who brought in the system of collusion that froze the free agent market in the 1985-87 offseasons. It was successful in driving down salaries, from almost 40% of revenues in 1985 down to 30% in 1989 (by MLB’s books, which should be taken with a grain of salt).
When the collusion system ended in 1988, salaries climbed rapidly against the backdrop of surging TV revenues, rising by almost 50% in two years. Changing tack, the owners tried a lockout in 1990 spring training, but it collapsed into a largely status quo agreement running through 1993. Salaries surged further, both in absolute and relative terms, rising to about 50% of revenue by 1992.
Over time, the small market hardliners had gained the upper hand, culminating with the ousting of Fay Vincent as commissioner and Bud Selig becoming acting commissioner. They were hellbent on achieving a hard salary cap, and willing to use hardball tactics to get it. When negotiations deadlock, U.S. labour law permits management to declare an impasse and impose new terms unilaterally.
With the big CBS TV contract up after 1993 and an expected halving of the annual rights fee, the owners initially wanted a new system in place prior to the 1993-94 offseason. In their zeal they predictably bungled the tactics, opening the door to the players striking, which would have wiped out the playoffs and the last big TV payday the owners were counting on. That was only averted by a promise not to impose any agreement that offseason.
Thus the 1994 season began without a collective agreement in place. Negotiations went nowhere, and it was clear the owners were going to impose a new system that winter. The players’ best leverage was holding the stretch run and postseason in the balance, so they went on strike in August. The World Series was cancelled in September, and shortly after MLB declared an impasse and set an early December deadline to implement their system.
It was pushed back a couple of times, but on December 23rd, 1994, the system was imposed and the battle shifted to the courts. The MLBPA filed unfair labour practice charges with the National Labor Relations Board, which agreed with them and sought an injunction in federal court to block the new system. If it were granted, the players would end their strike, since preventing the imposed system was their goal in the first place.
On the eve of a season set to begin with replacement players, a young judge named Sonia Sotomayor found MLB had acted in bad faith and granted the injunction. Once an emergency appeal was denied, the owners declined to lock out the players, and the new system was dead before it ever took flight, as effectively was the prospect of MLB imposing anything. It took another 18 months, until after the 1996 season, to reach a new collective agreement.
So what exactly was the basis of this new system that Bud Selig and the owners wanted so badly, and that the players resisted so stridently?
The main points of the model MLB imposed in December 1994:
- 40-man players to receive 50% of “defined revenues” (including benefits)
- Salary cap of 110% of the average and a salary floor of 84%
- Salary arbitration eliminated; graduated minimum salaries based on service time of $115K/$175K/$275K/$500K for less than one/two/three/four years, and $750K the last year if ranked as a Type A or B player (top 50%).
- Restricted free agency for players with four years of service; teams must issue a qualifying offer with a 10% raise on previous season and then can match any offer
In 1993, player salaries including benefits had reached 58% of revenues, so this system would have significantly rolled back the players’ share of the pie.
But what would things look like if this system were in place today? We don’t know exactly what constitutes “defined revenues”, especially given MLB’s history of diverting/hiding baseball-related revenue streams. For 2019, Forbes reported gross MLB revenues at $10.7-billion, and $9.7-billion net of “expenses and other investments”. But let’s just call it $10-billion in defined revenues, which would put the players’ share at $5-billion.
Per the Associated Press, player salaries in 2019 amounted to $4.22-billion for 40-man players, plus about $15.5-million per team in benefits. That would work out to $4.69-billion, and the total for CBT purposes (calculated slightly differently) was almost the same at $4.71-billion. That would mean another $300-million in spending under a 50% plan, and a minimum of about $150-million even if defined revenues were the $9.7-billion figure, with the potential to be significantly higher.
By definition, the average payroll would work out to $166.7-million ($5-billion divided by 30 teams), with an upper limit of $183.3-million at 110% and a lower bound of $140-million at 84%. At the high end, it would have restrained the spending of nine teams by a total of about $300-million. But on the lower end, it would have forced 13 teams to spend more, by a total of about $500-million. Though teams would surely figure out ways to circumvent the limits through creative contract maneuvering, it would go a long way towards ending tanking.
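For anyone who wants to check that arithmetic, here’s a quick back-of-envelope sketch in Python. It uses the round $10-billion defined-revenues assumption and the AP payroll figures above, so treat the outputs as rough estimates rather than anything official.

```python
# Rough math for the 1994 system applied to 2019, in millions of dollars.
# The $10,000M in "defined revenues" is the round-number assumption used
# above, not an official MLB figure.
TEAMS = 30
defined_revenues = 10_000
player_share = 0.50 * defined_revenues      # 50% of defined revenues
actual_2019 = 4_220 + TEAMS * 15.5          # 40-man salaries plus benefits (AP)

avg_payroll = player_share / TEAMS          # league-average payroll under the plan
cap = 1.10 * avg_payroll                    # upper limit at 110% of the average
floor = 0.84 * avg_payroll                  # lower bound at 84% of the average

print(f"Player share ${player_share:,.0f}M vs. actual ${actual_2019:,.0f}M "
      f"(gap ${player_share - actual_2019:,.0f}M)")
print(f"Average ${avg_payroll:,.1f}M, cap ${cap:,.1f}M, floor ${floor:,.1f}M")
# Prints a gap of roughly $315M, an average of about $166.7M,
# a cap of about $183.3M and a floor of about $140.0M.
```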
The average player might be better off, but what would the distributional effects be? It’s hard to say exactly how a graduated minimum might have been increased compared to how the single minimum actually was, but given the link to revenues, it’s plausible it would have grown in line with them. That would put the four thresholds at roughly $600K, $950K, $1.5-million and $2.7-million ($4-million for the top 50%) for 2019.
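As a sanity check on those numbers, here’s a small sketch scaling the 1994 graduated minimums by a revenue-growth factor. The roughly 5.4x factor is my own ballpark assumption for the growth in defined revenues between 1994 and 2019, chosen only to illustrate the scaling, not a figure from any CBA document.

```python
# Scale the December 1994 graduated minimums to 2019 terms, in $K.
# REVENUE_GROWTH is a rough, assumed factor for how much defined revenues
# grew between 1994 and 2019, used purely for illustration.
REVENUE_GROWTH = 5.4

minimums_1994 = {                 # service class -> minimum salary ($K)
    "less than 1 year": 115,
    "1-2 years": 175,
    "2-3 years": 275,
    "3-4 years": 500,
    "Type A/B (top 50%)": 750,
}

for tier, salary_k in minimums_1994.items():
    print(f"{tier:>20}: ${salary_k * REVENUE_GROWTH:,.0f}K")
# Prints roughly $621K, $945K, $1,485K, $2,700K and $4,050K,
# in line with the ballpark thresholds cited above.
```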
At the low end, for players with less than one year of service, the minimums would be a little less than under the new agreement (though a little higher than the actual 2019 minimum). But other current pre-arb players would be better off, significantly so at the two-year mark. Given that a majority of players don’t make it to arbitration eligibility, a graduated minimum would be a net benefit for a good number of players.
The drawback would be losing salary arbitration for established regulars who command significant arbitration salaries from the beginning; they would be stuck with something like $4-million instead (or $1.5-million in their third year if they’d have qualified for Super Two). For at least a couple dozen players annually, this would bite.
But the restricted free agency provision would more than offset it, and in fact the players most hurt by losing a year of salary arbitration would benefit most from being able to hit the open market two years earlier, with two more years of their prime to sell. It would particularly benefit players who debut later, or who end up having short careers and never get to hit the market near their peaks.
Given the sea change in understanding player aging and paying for (expected) performance accordingly, it’s safe to say MLB would never offer this today. Hindsight is 20/20, but it’s almost certainly the case that if they could go back in time, the players would be much better off trading salary arbitration (which small market owners vehemently hated) for restricted free agency.
The economic system that MLB tried to impose in 1994 is a perfect example of the difficulty of predicting the course of future developments. In a couple of decades, it has gone from a maximalist position by MLB to something that would, on the whole, actually be beneficial to the players today.