Almost 90 percent of these workers are at least 20 years old; almost 70 percent are in families with incomes below $60,000 per year; over half work full time; and more than one-fourth have children. For complete demographic details, see this recent report by the Economic Policy Institute.
Note that the oft-cited official data from the Bureau of Labor Statistics on the "Characteristics of Minimum Wage Workers: 2012" don't tell the whole story because those data look only at hourly workers who earn exactly $7.25 per hour. Many workers earning somewhat more than that -- say, $8.00 or $9.00 per hour -- would also receive an increase under current proposals.
Won't raising the minimum wage hurt employment?
A report I wrote for CEPR last February reviews the major research on the minimum wage over the last several decades and concludes that "[t]he weight of that evidence points to little or no employment response to modest increases in the minimum wage." The full report goes into detail, but the basic explanation is that, historically, minimum-wage increases have had only a small to moderate impact on employers' costs -- and firms have many ways to adjust to those costs without laying off workers. In economic terms, probably the most important of these adjustments is a reduction in worker turnover, which saves on hiring and training costs and raises firm productivity.
A recent letter signed by almost 600 economists, including seven Nobel prize winners and eight past presidents of the American Economic Association, agreed that "In recent years there have been important developments in the academic literature on the effect of increases in the minimum wage on employment, with the weight of evidence now showing that increases in the minimum wage have had little or no negative effect on the employment of minimum-wage workers, even during times of weakness in the labor market."
Isn't the minimum wage a blunt tool for helping low-income workers and their families?
As the Economic Policy Institute analysis I cited above demonstrates, almost 70 percent of the benefits of the minimum wage go to workers in families with incomes below $60,000 per year. Over half of the income gains go to workers in families with annual incomes of less than $40,000, and almost one-fourth to families with incomes below $20,000.
A recent, careful paper by economist Arindrajit Dube on the effects of the minimum wage on family incomes and poverty found that increases have a "small to moderate sized impact in reducing poverty" and that the benefits of the policy are concentrated in families in the bottom third of the income distribution. (For a CEPR blog post summarizing Dube's main findings, click here. For Dube's own blog summaries of his work, try here and here.)
The minimum wage disproportionately benefits low-income workers and their families. That the policy also helps workers and families with modest incomes above the federal poverty line, and even middle-income families, is a feature, not a flaw, of the policy.
Wouldn't it make more sense just to increase the Earned Income Tax Credit (EITC)?
The minimum wage and the EITC are strong policy complements, not substitutes.
The EITC is a wage subsidy that benefits both participating workers and employers. Economist Jesse Rothstein has estimated that about 27 cents of every dollar spent on the EITC goes to employers, rather than the low-wage, low-income workers that are the intended beneficiaries.
Employers capture part of the benefits because the EITC has the effect of lowering market wages. The EITC is intended to increase the incentive for working. As a result, it draws more people into the labor market and causes some people to work more hours. This increase in supply, in turn, drives down the market wage (before EITC benefits are paid), lowering employers' labor costs. (These effects are not small. Economist Andrew Leigh has estimated that "a ten percent increase in the generosity of the EITC is associated with a five percent fall in the wages of high school dropouts and a two percent fall in the wages of those with only a high school diploma.")
With this dynamic in mind, a higher minimum wage can reduce the portion of EITC expenditures that are lost to employers by limiting the degree to which the EITC-induced increase in supply can depress market wages.
How does the value of the minimum wage today compare with the past?
The short answer is that it depends on (1) what you use as a historical benchmark and (2) how you adjust for changes in prices between the chosen benchmark and the present.
The most common historical reference point is 1968, which is the year the minimum wage reached its highest level according to all of the most common methods used to convert historical values to today's dollars. Sometimes researchers also compare the current minimum wage to its value in 1979, the year when many indicators of economic inequality began to rise, following five decades when economic inequality was either falling or flat.
As for how best to convert historical levels of the minimum wage to current dollars, economists have used several different approaches. In a blog post last summer, Janelle Jones and I pulled together the most common of these. Putting the 1968 value of the minimum wage in today's dollars using the CPI-U -- the official inflation measure in place over that period -- the minimum wage in 2013 would have been about $10.75 per hour, almost 50 percent higher than its current value of $7.25.
But the Bureau of Labor Statistics (BLS) has modified the way it measures inflation in the years since 1968. If we apply today's procedure back to 1968, the minimum wage in 2013 would have been about $9.50 per hour -- 30 percent higher than its current value.
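The inflation-adjustment arithmetic behind these two estimates can be sketched as follows. The index values below are approximate annual averages used for illustration, not the exact series behind the published figures:

```python
# Sketch of converting the 1968 minimum wage to 2013 dollars with a price index.
# Index levels here are approximate assumptions for illustration only.

MIN_WAGE_1968 = 1.60  # federal minimum wage in 1968, nominal dollars


def adjust(nominal, index_then, index_now):
    """Convert a historical dollar amount to current dollars via a price index."""
    return nominal * index_now / index_then


# CPI-U (the official methodology of the period): roughly 34.8 in 1968
# and roughly 233.0 in 2013 (assumed approximate values).
cpi_u_adjusted = adjust(MIN_WAGE_1968, 34.8, 233.0)
print(f"CPI-U adjusted: ${cpi_u_adjusted:.2f}")  # close to the $10.75 figure
```

A "current methods" series like the CPI-U-RS rises more slowly over the same period, which is why applying today's procedure yields the lower figure of about $9.50.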
Another common benchmark is to compare the minimum wage to what the average production worker made in 1968 ("production and non-supervisory workers" make up about 80 percent of the workforce). In 1968, the minimum wage was equal to 53 percent of the average production worker wage. In 2013, the minimum stood at just 36 percent of the average production worker wage ($20.16). If we round the 53 percent peak down to 50 percent, the 2013 minimum wage would have been just a few cents over $10.00 per hour.
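The production-worker benchmark above is simple arithmetic, and the figures come straight from the text; a minimal sketch:

```python
# What would the 2013 minimum wage be if restored to (roughly) its 1968
# share of the average production-worker wage? Figures are from the post.

avg_production_wage_2013 = 20.16  # average production-worker wage, 2013
min_wage_2013 = 7.25

ratio_2013 = min_wage_2013 / avg_production_wage_2013  # share in 2013
target_at_half = 0.50 * avg_production_wage_2013       # 53% peak rounded to 50%

print(f"2013 share of average wage: {ratio_2013:.0%}")      # about 36%
print(f"Minimum wage at 50% of average: ${target_at_half:.2f}")  # $10.08
```

Half of $20.16 is $10.08, hence "just a few cents over $10.00 per hour."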
A less common, but illustrative benchmark is average productivity growth in the economy. Between the end of World War II and the beginning of the 1970s, the value of the minimum wage kept pace with the growth in average productivity (the average value of goods and services produced in one hour of work in the non-farm business sector). From the early 1970s to the present, the minimum wage has trailed far behind productivity growth. If, instead, the minimum wage had continued to grow in line with a conservative estimate of productivity growth after 1968, by 2013, the minimum wage would stand at more than $17.00. (Less conservative estimates of productivity growth suggest even higher levels.)
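A back-of-the-envelope version of the productivity benchmark can be sketched as below. The cumulative productivity figure is an assumption chosen for illustration; the underlying productivity series and deflator choices matter a great deal to the exact result:

```python
# Productivity benchmark sketch: start from the 1968 minimum wage expressed
# in 2013 dollars (current CPI methods, ~$9.50 per the discussion above) and
# let it grow with an ASSUMED conservative cumulative productivity gain.

real_1968_min_wage = 9.50   # 1968 minimum in 2013 dollars, current methods
cumulative_productivity_growth = 0.80  # assumed conservative gain, 1968-2013

wage_if_tracked = real_1968_min_wage * (1 + cumulative_productivity_growth)
print(f"${wage_if_tracked:.2f}")  # more than $17.00
```

Less conservative productivity measures imply a larger cumulative gain and therefore an even higher counterfactual minimum wage.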
In a blog post in 2012, Marie-Eve Augier and I compared the 2011 value of the minimum wage with where it was in 1979, using the cost of health insurance and college as a yardstick. The post contains the full set of results, but to give just one example: in 1979, it took 254 hours of work at the minimum wage to pay for a year of tuition at the average four-year public college; in 2011, it took 923 hours.
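The hours-of-work yardstick can be turned back into implied dollar costs using the federal minimum wage in each year ($2.90 in 1979, $7.25 in 2011). The hours figures come from the post; the implied tuition levels below are back-of-the-envelope illustrations, not the exact published averages:

```python
# Back out the implied annual tuition from hours-of-work-at-minimum-wage.

min_wage = {1979: 2.90, 2011: 7.25}        # federal minimum wage by year
hours_for_tuition = {1979: 254, 2011: 923}  # hours figures from the post

implied_tuition = {
    year: hours_for_tuition[year] * min_wage[year] for year in min_wage
}

for year, cost in implied_tuition.items():
    print(f"{year}: {hours_for_tuition[year]} hours -> about ${cost:,.0f}")
```

The point of the yardstick is the ratio: tuition went from roughly 250 hours of minimum-wage work to well over 900 hours.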
No benchmark is perfect, but it is striking that all of the most commonly used reference points suggest that the minimum wage today is substantially below its historical levels.
That the minimum wage is so low today is particularly striking given that minimum-wage workers are substantially older and better educated than they were in the past (for example, relative to 1979 or 1968).