That's what he told us in his column today, because he sure didn't make much of an argument. Lane cites several recent papers showing that the minimum wage has no negative effect on employment (including a paper by my colleague John Schmitt). He then notes that these studies could be right, but he also refers to research by David Neumark of the University of California at Irvine and William Wascher of the Federal Reserve that shows the last minimum wage hike (from $5.15 an hour in 2007 to $7.25 in 2009) lowered employment of young people by 300,000.
He then warns that if their research is right, and we push the minimum wage too high, then we could be hurting the people we are trying to help. He also points out that even if the research showing no employment effect is right, then we would still be hurting other workers, either by pushing up prices or through wage compression. He then proposes spending more money on the earned-income tax credit (EITC) as an alternative to a higher minimum wage.
Okay, let's have some fun here. Lane's bad story is that 300,000 fewer workers would be employed. That sounds really awful; after all, these are the people we are trying to help. But let's think about this one for a moment. The jobs we are talking about tend to be high-turnover jobs that workers only hold for relatively short periods of time. The research that Lane is depending on shows that at any point in time 300,000 fewer workers will be employed as a result of a minimum wage hike of more than 40 percent. In effect, this means that workers will on average have to spend more time between jobs looking for work.
More than 3 million workers were in the affected wage band between the old and new minimum wage. If we assume that on average they worked 10 percent less (the 300,000 job loss) and that their average hourly wage gain was 20 percent (half of the wage increase), then on average these workers will net roughly 8 percent more in pay each year (120 percent of the wage times 90 percent of the hours), while working 10 percent fewer hours. Pretty awful story, huh? And that's based on the research that finds a negative effect on employment.
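The back-of-the-envelope arithmetic above can be checked in a couple of lines (the 20 percent wage gain and 10 percent hours reduction are the illustrative figures from the paragraph, not new estimates):

```python
# Illustrative figures from the text: hours fall 10 percent on average,
# hourly wages rise 20 percent on average for workers in the affected band.
hours_factor = 0.90   # 10 percent fewer hours
wage_factor = 1.20    # 20 percent higher hourly wage
pay_change = wage_factor * hours_factor - 1.0
print(f"Average change in annual pay: {pay_change:+.0%}")  # roughly +8%
```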
As for the rest of Lane's story: yes, the higher pay for minimum wage workers comes mostly out of other workers' pockets. (Some comes from profits and some comes from increased productivity.) This is true of everyone's pay. If protectionists did not dominate national policy we could import more doctors, bring our doctors' pay in line with pay in other wealthy countries, and save other workers close to $100 billion a year in health care costs. But the money that goes out of workers' pockets to support the excess pay of doctors, Wall Street bankers, or CEOs doesn't concern Lane, only the money that goes to pay custodians, retail clerks, and dishwashers.
But the best part is the idea that the EITC is somehow free. In fact, we need government revenue to pay for the EITC, which requires taxes. These taxes come out of workers' pockets and have their own distorting effect on the economy. In addition, there are costs associated with administering the EITC. While it does not have nearly the level of fraud claimed by its critics, clearly some portion of the money is paid out improperly. And low-wage workers often have difficulty dealing with tax returns. Many throw hundreds of dollars in the garbage paying tax preparation services in order to claim their EITC.
In short, Lane doesn't really have much of a case against a higher minimum wage even if we accept his bad story about job loss. And, he seems to have imagined that there is an alternative costless way to get more money to low-paid workers.
There is one final point worth noting in the context of proposals to increase the minimum wage. From its inception in 1938 to 1969, the minimum wage rose in step with economy-wide productivity growth. If we had continued this policy over the last four decades the minimum wage would be $16.50 an hour. Even if the minimum wage is raised to $9.00 an hour, minimum wage workers would get none of the benefits of economic growth over the last four decades.
Charles Lane wrote to tell me that I had misrepresented the Neumark estimate of the employment impact of the last minimum wage hike. Neumark was only referring to the impact of the last phase of the increase (which was phased in over three years), from $6.55 to $7.25, a rise of 10.7 percent, not the increase from $5.15 that I had referred to in my initial note.
I should have looked at his reference in the column. I'll admit that I have not taken Neumark's work on the minimum wage seriously since he uncritically took data from the fast food industry lobby to try to argue the case that the minimum wage caused unemployment. It turned out that the industry had cooked the data. When Neumark used data that was independently collected, he found the same result as everyone else: the minimum wage did not increase unemployment.
But, even if we take Neumark's numbers at face value, we still don't get much of a horror story. A rise of 10.7 percent means that the average gain would be around 8.9 percent. (To see the logic, imagine that hourly earnings were originally distributed evenly between the prior minimum wage of $5.15 an hour and the new minimum wage of $7.25 an hour. After two rounds of minimum wage hikes, two-thirds of the workers are now sitting at $6.55 an hour or close to it. The remaining third are evenly distributed across the remaining band. This would mean that two-thirds of the affected workers would receive the full 10.7 percent increase, while the remaining third would see an average hike of 5.4 percent. This gives an average increase of 8.9 percent.)
Neumark's estimate would then imply workers are on average putting in 10 percent fewer hours and taking home 2 percent less money. This is based on the assessment of an economist who has devoted a career to trashing the minimum wage. Can't say that sounds like a horror story.
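The averaging in the note above is easy to reproduce. Using the figures from the text, two-thirds of affected workers get the full 10.7 percent raise, one-third get 5.4 percent on average, and hours fall by 10 percent:

```python
# Figures from the text: 2/3 of affected workers get the full 10.7 percent
# raise, 1/3 get 5.4 percent on average, and hours worked fall 10 percent.
avg_raise = (2 / 3) * 0.107 + (1 / 3) * 0.054   # about 0.089, i.e. 8.9 percent
take_home = (1 + avg_raise) * 0.90 - 1.0        # about -0.02, i.e. 2 percent less
print(f"average raise: {avg_raise:.1%}, take-home change: {take_home:+.1%}")
```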