The basic trajectory of income growth since World War II is now pretty familiar. The Economic Policy Institute’s State of Working America series has tirelessly underscored the sharp divide between shared prosperity before 1979 and skewed prosperity since 1979. And a recent report from the Pew Research Center charts this out decade by decade to make the same point. This graphic runs those same numbers, charting income growth by quintile across four different chronological breakdowns.
Arranged by decade, the results are stark and unsurprising. Two decades of shared prosperity—followed by a decade of slow income growth, two decades of skewed income growth, and a decade (the most recent) of unrelenting recessionary losses.
Arranged by Presidential Administration, we see a similar picture. Again, it is striking how strong income growth is across the board through the end of the Johnson Administration, and how dramatically that collapses into either uneven growth or net losses. The sole exception here is the two Clinton Administrations, in which we see broad, Eisenhower-like income growth. It is easy to make too much of the patterns by Administration, since occupancy of the White House doesn’t necessarily translate into control over policy. But the summary “by party” numbers are revealing. Incomes for the poorest 20 percent grew 80 percent across seven Democratic Administrations, and just 7 percent across nine Republican terms.
Arranged by business cycle, the pattern is especially striking. Most of the postwar gains for low- to median-income families are crammed into the single long cycle from April 1960 to December 1969. The patterns of growth across the Reagan (July 1981 to July 1990) and Clinton (July 1990 to March 2001) cycles are essentially similar—as are the patterns of losses across the last two business cycles.
Colin Gordon is a professor and director of Undergraduate Studies, 20th Century U.S. History, at the University of Iowa.
Today's Census release on Health, Income, Poverty, and Inequality in 2011—what I like to call the annual HIPI report—was not particularly surprising. The best news was that the number of people without health insurance declined by 1.3 million, with the largest gain among adults ages 19-25. The credit here largely goes to the Affordable Care Act.
The bad news was that median household income declined 1.5 percent in real terms from 2010. However, the poverty rate and numbers didn't really change. It may seem odd that they didn't move in the same direction—but the big story here was more Americans working full-time, year-round in jobs that don't pay so hot.
The table below shows what happened. Almost 2.2 million more Americans were working year-round, full-time in 2011 than in 2010. But full-time workers were more likely to be working in poorly compensated jobs. As a result, median income for all year-round, full-time workers fell by a stunning 2.5 percent.
Table: Number and Median Income ($) of Full-Time, Year-Round Workers, 2010 and 2011
The chart below, from David Johnson's presentation at the Census news conference, shows the change in the number of full-time, year-round (FTYR) workers by income quintile.
As this shows, the growth in full-time, year-round workers happened mostly at the bottom (a 17.3 percent increase) and at the top.
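The compositional point is worth pausing on: a median can fall even when no individual worker takes a pay cut, simply because the new entrants cluster at the bottom of the distribution. A toy example (the wage figures here are hypothetical, not Census data):

```python
import statistics

# Stylized 2010 workforce: five full-time, year-round workers (hypothetical pay)
pay_2010 = [20_000, 35_000, 50_000, 65_000, 80_000]

# Stylized 2011: the same five workers at the same pay, plus two new
# full-time workers who found jobs at the low end of the distribution
pay_2011 = pay_2010 + [18_000, 22_000]

print(statistics.median(pay_2010))  # 50000
print(statistics.median(pay_2011))  # 35000

# The median falls even though no individual took a pay cut: growth
# concentrated in poorly compensated jobs shifts the middle of the
# distribution downward.
```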
Things would be very different, and much more positive for the middle class and working class, if Congress had passed the American Jobs Act proposed by President Obama in 2011. The Jobs Act would have, among other things, created 1.3 million jobs by the end of 2012, including jobs for several hundred thousand construction workers, teachers, and first responders.
PS: Some economists have argued that basic social insurance programs, like the Supplemental Nutrition Assistance Program (SNAP), have operated as a work disincentive. But that claim isn't supported by the data—the increase in FTYR work came even though more low-income families got help making ends meet from SNAP last year.
Our recent CEPR report “Bad Jobs on the Rise” found that between 1979 and 2010, the share of workers in a “bad job” increased from about 18 percent to about 24 percent. In that report, we defined a bad job as one that pays less than $37,000 a year, lacks employer-provided health insurance, and lacks an employer-sponsored retirement plan. (For more details, see the report or this earlier post.)
The report emphasizes that the increase in bad jobs was especially disconcerting because the U.S. workforce was older and much better educated in 2010 than it had been in 1979. In the figure below, prepared by our CEPR colleague Milla Sanes, we ask: What would the bad-jobs rate have been in 2010 without this age and educational upgrading of the workforce? The middle line shows the actual path of the bad-jobs rate over the period. The top line traces out the path the economy would have followed had the workforce kept the same age and educational characteristics it had in 1979, combined with the same losses in the economy’s ability to create good jobs. Under these assumptions, about one in three jobs (33.9 percent) would have been a bad job in 2010. The distance between the top two lines represents how much the age and educational upgrading of the workforce helped to lower the bad-jobs rate.
We can also ask a related question: What would the bad-jobs rate have been in 2010 if the economy had not lost any of its capacity to generate good jobs between 1979 and 2010? The bottom line in the figure traces out this path. If the economy had sustained the same capacity to produce good jobs that it had in 1979, and the workforce upgrading proceeded as it actually did between 1979 and 2010, the overall bad-jobs rate would have fallen to 14.1 percent in 2010. The distance between this line and the actual bad jobs rate captures just how much the economy has shifted against workers.
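The two counterfactual lines follow from a standard reweighting identity: the overall bad-jobs rate is the workforce-share-weighted average of the rates within age-education cells, so holding either the shares or the within-cell rates at their 1979 values yields a counterfactual path. A minimal sketch of the arithmetic with two made-up cells (the shares and rates below are illustrative, not the CEPR estimates):

```python
# Overall bad-jobs rate = sum over demographic cells of
# (workforce share in cell) x (bad-jobs rate in cell).
# The cell shares and rates below are illustrative only.

shares_1979 = [0.60, 0.40]   # e.g., "young, HS or less" vs. "older, some college+"
shares_2010 = [0.35, 0.65]   # older, better-educated workforce by 2010

rates_1979 = [0.25, 0.08]    # bad-jobs rate within each cell, 1979
rates_2010 = [0.45, 0.13]    # each cell fares worse by 2010

def overall(shares, rates):
    """Workforce-share-weighted average of cell-level bad-jobs rates."""
    return sum(s * r for s, r in zip(shares, rates))

actual_2010 = overall(shares_2010, rates_2010)

# Top line: 2010 cell rates, but the 1979 workforce composition
no_upgrading = overall(shares_1979, rates_2010)

# Bottom line: 2010 workforce composition, but 1979 cell rates
no_deterioration = overall(shares_2010, rates_1979)

print(f"actual 2010 rate:      {actual_2010:.1%}")       # 24.2%
print(f"without upgrading:     {no_upgrading:.1%}")      # 32.2%, higher
print(f"without deterioration: {no_deterioration:.1%}")  # 14.0%, lower
```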
After employees at a workplace form a union, their work is really only half-done. The reasons for forming unions vary, but typically include improving working conditions, increasing pay and benefits, or simply having more say in the work environment. The way to do this is by negotiating a legally binding contract that spells these things out. Unfortunately, for those U.S. workers who are able to overcome the obstacles that employers place in front of them when they try to unionize, obtaining a contract appears to be getting more and more difficult.
Though both unions and employers are required to negotiate “in good faith,” the penalties for not doing so are minimal. Of the two parties, it is usually the employer that drags its feet in negotiations and refuses to bargain in good faith, as can be seen in filings of “unfair labor practice” charges. If these charges are found to have merit, the employer is typically just ordered back to negotiations, with no real punishment handed down for breaking the law. With union density – and thus workers’ bargaining power – at its lowest point in the past century, it should come as no surprise that research has shown that nearly half of newly formed unions are still without a contract two years after unionizing. U.S. labor law, after doing little to help workers fight strong – and often illegal – employer opposition to their efforts to form unions, also does little to ensure that they are able to obtain a contract.
Canada, on the other hand, has a policy called “first contract arbitration” that provides a way through bargaining impasse. First enacted in 1974, first contract arbitration is now law in most Canadian provinces and covers approximately 85 percent of the workforce. Though researchers have identified four different models, the basic premise is simple: if negotiations have broken down between an employer and a union, either party can apply to the first contract arbitration process. If conciliation and mediation fail to produce a voluntarily agreed-upon contract, an arbitrator (or arbitration panel) will impose one.
The Employee Free Choice Act would have brought first contract arbitration to the United States. Despite concerns from critics – that it would have discouraged voluntary collective bargaining or resulted in onerous conditions that would have put employers out of business – the experience of first contract arbitration in Canada has shown it to be a successful policy. In addition to workers getting contracts where they might otherwise not, research has shown that first contract arbitration is rarely sought and first contracts are even more rarely imposed, that it results in fewer work stoppages, and that there is no effect on business success or failure. Canadian employers – who first opposed FCA on similar grounds to U.S. employers today – no longer find it to be a controversial policy and often apply to the process themselves. If U.S. policy makers wanted to do something about the decline of the middle class, first contract arbitration would be a good example to follow.
Earlier this week, John Schmitt and I released a CEPR report on the rise of "bad jobs" over the past three decades. The new report is a follow-up to "Where Have All the Good Jobs Gone?", which CEPR released in July. Together, the two papers examine two sides of the same depressing coin.
We define a bad job as one that pays less than $37,000 per year, lacks employer-provided health insurance, and has no employer-sponsored retirement plan. In 2010, the most recent year with available data, about 24 percent of U.S. workers were in a bad job, up from 18 percent of workers in 1979. The share of women in bad jobs only increased about 1 percentage point between 1979 and 2010; for men, there was a 10 percentage-point increase over the same period. But, at every point in the last 30 years, women were still more likely than men to be in a bad job.
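In code terms, the definition is a three-part test applied to each worker's record. A minimal sketch of how one might tabulate the bad-jobs share (the field names and sample records here are hypothetical, not the variables in our CPS extract):

```python
from dataclasses import dataclass

@dataclass
class Worker:
    annual_pay: float          # annual earnings in dollars
    employer_health_ins: bool  # employer-provided health insurance?
    employer_retirement: bool  # employer-sponsored retirement plan?

BAD_JOB_PAY_THRESHOLD = 37_000  # the report's earnings cutoff, in dollars per year

def is_bad_job(w: Worker) -> bool:
    """A bad job fails all three tests: low pay, no health plan, no retirement plan."""
    return (w.annual_pay < BAD_JOB_PAY_THRESHOLD
            and not w.employer_health_ins
            and not w.employer_retirement)

workers = [
    Worker(28_000, False, False),  # bad job: low pay, no benefits
    Worker(28_000, True, False),   # not a bad job: has health insurance
    Worker(55_000, False, False),  # not a bad job: pay above threshold
]
bad_share = sum(is_bad_job(w) for w in workers) / len(workers)
print(f"{bad_share:.0%}")  # 33%
```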
The increase in bad jobs occurred even as the workforce, on average, became more educated and experienced, just the opposite of what we would not only hope for but expect from the economy.
The economy added 96,000 jobs in August, roughly the pace needed to keep even with the growth of the labor force, according to the Bureau of Labor Statistics' latest employment report. The unemployment rate also dropped to 8.1 percent, but this was entirely due to a drop in the labor force, as reported employment in the household survey edged downward. There was more negative news: the jobs numbers for June and July were both revised down by roughly 20,000, bringing the three-month average to 94,000, and the employment-to-population ratio (EPOP), from the household survey, dropped 0.2 percentage points to 58.3 percent.
But there were a couple of bright spots. The percentage of unemployment due to people voluntarily quitting their jobs, a measure of workers’ confidence in the labor market, rose to 7.5 percent, putting it near its winter levels. Also, the number of workers involuntarily working part-time, as well as the number of discouraged workers, both fell, pushing the broad U-6 measure of labor market slack to its lowest point since January of 2009. Undoubtedly some seasonal factors depressed August’s job number. While September will likely tell a better story, job growth is barely fast enough to keep pace with the growth of the labor force.
For a more in-depth analysis, check out the latest Jobs Byte.
My colleague John Schmitt provided an excellent overview of my recent paper about organized labor in the United States and Canada, “Protecting Fundamental Labor Rights: Lessons from Canada for the United States.” In that post, John laid out the basic argument – that there are two key differences between the United States and Canada that have allowed the unionization rate in Canada to remain stable over the past half century while it has plummeted in the United States. The first of these differences is the process by which workers form unions in the two countries, which is what I want to take a closer look at in this blog post. In a second post in the days to come, I'll look at the other key difference – how bargaining impasses are handled after the initial formation of a union.
Both the U.S. and Canada have two different processes for forming unions – mandatory elections and card check. Under mandatory elections, unless an employer voluntarily recognizes a union, employees who want to form a union at their workplace must file a petition for an election with the labor board – a government agency that enforces the collective bargaining laws – showing support from a minimum share of the workforce (35 percent in the United States and 35-45 percent in Canada, depending upon the province). This is usually done with signed authorization cards, and employees, or unions on their behalf, will typically gather much more support than this before filing the petition – usually around 65 percent. The labor board, after verifying the cards against the employer's payroll records, will schedule and hold an election in which employees can vote on whether or not to form a union.
Under card check, the process is more streamlined. If employees are able to show majority support (ranging from 50-65 percent), the labor board will simply verify the signed authorization cards, and, if there is the required majority of the proposed bargaining unit, the board will certify the union. If there is less than the required level of support, the labor board will schedule and hold an election. In both mandatory elections and card check, once a union has been certified by the labor board, the employer is required to recognize the union and both the employer and the union are required to “bargain in good faith.”
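Stripped of legal detail, the two routes differ only in what a given level of verified card support triggers. A schematic sketch, with the caveat that the exact thresholds vary by country and province as noted above:

```python
def certification_step(support: float, regime: str,
                       petition_threshold: float = 0.35,
                       majority_threshold: float = 0.50) -> str:
    """Next step for a union showing `support` (fraction of the proposed
    bargaining unit, via verified authorization cards).

    Thresholds are illustrative: the petition minimum is 35% in the U.S.
    and 35-45% in Canada; the card-check majority ranges from 50-65%.
    """
    if regime == "card_check" and support >= majority_threshold:
        return "labor board verifies cards and certifies the union"
    if support >= petition_threshold:
        return "labor board schedules and holds an election"
    return "petition cannot be filed; gather more support"

print(certification_step(0.65, "card_check"))          # certified outright
print(certification_step(0.65, "mandatory_election"))  # election even at 65%
print(certification_step(0.40, "card_check"))          # short of a majority: election
```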
I largely agree with Leonhardt's conclusion as stated: “If you’re trying to understand why every income group except for the affluent has taken an income cut over the last decade, you probably shouldn’t put the minimum wage at the top of your list of causes.”
But, I worry that much of the piece will give readers the wrong impression about the minimum wage as a policy to fight inequality.
(1) That the minimum wage is not *at the top* of the list for explaining inequality does not mean that it is not an *important* determinant of inequality.
The minimum wage is just one of a constellation of policies that have pushed inequality up over the last three decades: high unemployment (not just the Great Recession, but most of the last 30 years excluding 1996 to 2000), declining unionization rates, pro-corporate trade deals, deregulation of many previously well-paying industries, privatization of many state and local government jobs, a dysfunctional immigration system, poor enforcement of existing labor laws, and others. The common thread running through all of these policies is that they have all served to undermine the bargaining power of workers relative to their employers. Different policies have had effects on different parts of the workforce (by wage level or race or gender) at different times. But they have uniformly acted to pull the bottom out of the labor market.
Given all these forces pushing in the same direction, it would be odd if the minimum wage were at the top of the list.
My CEPR colleague, Kris Warner, has a new paper on what we can learn about labor law here in the United States from the experience of our neighbors in Canada. The whole paper is worth a read, but I particularly like two of the graphs. The first shows that Canada and the United States were on a very similar unionization path from about 1920 through the 1960s. At that point, unionization rates in the two countries diverged sharply.
The second graph presents the overall union coverage rates in each of the U.S. states and Canadian provinces. All but one of the Canadian provinces lie entirely above all of the U.S. states. Only New York and Alaska edge out the least unionized Canadian province of Alberta.
Kris emphasizes two important differences between Canada and the United States. The first is that, in the private sector, it is generally much easier for workers to form a union in Canada, primarily because most workers there can form a legally recognized union based on collecting verified signatures. In the United States, of course, even after workers sign cards asking for recognition, they then face a second hurdle in the form of a National Labor Relations Board election (unless the employer decides to recognize the union based on the collected cards). As John Logan and others have documented (pdf), an entire “union avoidance” industry of consultants and lawyers has arisen in the United States over the last several decades to intimidate workers during the NLRB election process, and this industry has generally been highly effective.
The second key structural difference between Canada and the United States, says Kris, is that Canadian workers can rely on first-contract arbitration to ensure that they will secure a contract after legal recognition. In the United States, even after workers win an election, they reach a contract in only a bit over half of cases (see John-Paul Ferguson's excellent paper (pdf)).
Along with former Senator Alan Simpson, Erskine Bowles has become known to much of the public as the co-chair of President Obama’s deficit commission. The two of them produced a report that is viewed by many in the media and leading Democrats in Congress as providing the basis for a “Grand Bargain” on a long-term deficit reduction package.
However, in addition to his duties on President Obama’s deficit commission, Erskine Bowles also has a day job. In fact, he has many of them. Over the last decade, Mr. Bowles has sat on a large number of corporate boards. This is in addition to serving as president of the University of North Carolina from 2005 to 2010 and making unsuccessful runs for a U.S. Senate seat in North Carolina in 2002 and 2004.
Some of the companies for which Mr. Bowles served as a director have gained considerable notoriety in recent years. He served on the board of Krispy Kreme, the upstart doughnut company that was briefly a Wall Street darling. He also sat on the board of General Motors from June of 2005 until it went into bankruptcy in the spring of 2009. He joined the board of Morgan Stanley, the Wall Street investment bank, near the peak of the housing bubble in December of 2005. He remains on its board today. He also joined the board of Facebook in September of last year.
The paper’s authors, CEPR Senior Economist John Schmitt and Research Assistant Janelle Jones, also wrote a series of posts for the CEPR Blog. John wrote this post on good jobs by education level. John also penned this post, which uses the data to debunk the technological change story. Janelle wrote this post on good jobs and gender. She also wrote this one looking at employer-sponsored retirement plans, as well as this one examining employer-provided health insurance. University of Iowa History Professor Colin Gordon posted this interactive graph on the CEPR blog summarizing the study’s findings.
CEPR on Honduras
CEPR’s Senior Associate for International Policy Alex Main teamed up with Rights Action’s Annie Bird and Karen Spring to release “Collateral Damage of a Drug War,” based on the authors’ investigation in Honduras into the May 11, 2012 deaths of four people in a DEA-related counternarcotics operation in the Moskitia region. The authors conducted extensive interviews with survivors, eyewitnesses, and U.S. and Honduran government officials, finding inconsistencies between survivor accounts and the statements of government officials.
In his RNC speech last night, Rick Santorum claimed that the U.S. poverty rate would be close to zero if all of us here in the land of the free just did three simple things: (1) worked full-time, year-round (for every year of our entire working life); (2) graduated from high school (regardless of the quality of that education), and (3) got married (regardless of the quality of that marriage).
Absent massive government investments in publicly subsidized jobs, combined with a federal mandate that Americans never get sick or disabled, it is hard to imagine how Santorum's simple thing number 1 is achievable. (And, if we let Americans in Charles Murray's "lower tribe" have children, then we need big new investments in child care—Santorum, I know, is a big fan of the former, to the point of making childbearing mandatory after conception, but not so much the latter.)
So let's just focus on the other two: finishing high school and universal marriage. A quick glance at the educational and marital demographics of poverty is all it takes to dismiss Santorum's arguments here.
First, marriage. As the table below shows, nearly two-thirds of working-age adults (25-64, the same age range used by Santorum) with incomes below the meager federal poverty line either are currently married or have been married. Over one in three are currently married and not separated. So marriage is clearly no panacea.
Table: Working-Age Adults (25-64) with Below-Poverty Income by Marital Status, 2010 (rows: currently or previously married; currently married but separated; divorced or widowed)
Second, finishing high school. As the table below shows, the vast majority of adults with below-poverty incomes finished high school, and more than one in three have education beyond that. Some 2.3 million working-age adults with college degrees had below-poverty incomes in 2010.
Table: Working-Age Adults (25-64) with Below-Poverty Income by Educational Attainment, 2010 (rows: high school grad or higher; high school grad only; HS plus some college; bachelor's or higher; no high school diploma)
Millions of married Americans with high school diplomas and beyond live below the poverty line today. Scolds like Santorum deny this reality and blame individual Americans for their economic struggles because they don't want to acknowledge that the real responsibility lies with failed conservative economic policies, and the incredible economic mismanagement of, among others, Robert Rubin, Alan Greenspan, and Ben Bernanke.
I remember being struck several years ago by David Brooks' odd use of the adjective "disorganized" to describe single-parent families. As he put it in the New York Times in 2007: "A human capital agenda ... means preserving low income-tax rates .... [and] creating high-quality preschools for children from disorganized single-parent homes." I'm all for high-quality preschools, although for reasons that have little to do with either home organization or single-parenthood. Personally, I think pre-K should be universal, and shouldn't be limited to children living in single-parent households whose parents aren't able to keep their closets and counters clutter-free. But I guess if, like Brooks, your main goal is preserving low taxes for the 1 percent, you might think this kind of rationing of pre-K is necessary.
I suspected at the time that "disorganized single-parent" was simply a more genteel and NYT-reader-friendly way of saying what Charles Murray once said at a Capitol Hill symposium about single mothers: "There is a dirty little secret about the problem of out-of-wedlock births to poor women. The dirty little secret is that very large numbers of them are rotten mothers." That, of course, is something that one can say in the editorial pages of the Wall Street Journal or other Murdoch papers, but not in the Grey Lady.
The recent work of CEPR’s John Schmitt and Janelle Jones shines a harsh spotlight on the dramatic decline in “good jobs” over the last generation. In Where Have All the Good Jobs Gone?, Schmitt and Jones show that the share of good jobs (defined by an earnings threshold of $18.50/hr and the job-based provision of health coverage and a retirement plan) has fallen—even as the age and educational attainment of the workforce has risen.
The interactive graph below summarizes their findings: select any combination of demographics (all workers, women, men) in the upper right, and any combination of “good job” elements (earnings, health coverage, retirement plan) in the bottom pane. Start with the earnings threshold for all groups. Here we see decent gains for women, although no more than we might expect given gains in labor productivity and educational attainment over the same span. The share of men at this threshold, by contrast, falls—from about 57.5 percent in 1979 to about 54.5 percent in 2010. Selecting a single demographic underscores the contributing factors. For men, declining pension and health coverage (combined with flat earnings) led to a steep decline in the good-job share (from 37.5 percent in 1979 to 27.7 percent in 2010). For women, health coverage has fallen less dramatically (from a lower starting point), and pension coverage is pretty flat—together they have dampened but not erased the gains in earnings (yielding a modest increase in the share of good jobs, from 12.4 percent to 21.1 percent).
Sixteen years ago on Wednesday, President Bill Clinton surrendered to House Speaker Newt Gingrich and signed NewtAid, legislation that replaced the Social Security Act's Aid to Families with Dependent Children (AFDC) with a right-wing block grant scheme called Temporary Assistance for Needy Families (TANF). NewtAid/TANF was prematurely lauded as a success before it had even been fully implemented.
It is now clear TANF is a failed program that needs to be overhauled. NewtAid's failure can be seen most simply by comparing the number of children living below the federal poverty line in 1992 and 2010. (These years are compared because 2010 is the most recent year for which we have child poverty numbers, and 1992 is, like 2010, the first calendar year after the end of a recession.)
Number of Children Living in Families with Incomes Below the Federal Poverty Line
1992: 15.3 million
2010: 16.4 million
In sum, just over 1 million more children lived below the poverty line in 2010, more than a decade after TANF's implementation, than in 1992, before TANF was implemented. If AFDC had remained in place and been reformed along progressive lines, the number of children living in poverty would be much lower today than it was before NewtAid.
AFDC was far from a perfect program—especially after Reagan-era budget cuts limited the support it provided for working parents—but it was one of the dependable pillars of our system of social insurance (although nowhere near as strong a pillar as Social Security). Instead of strengthening AFDC, NewtAid tore it down and replaced it with a radical conservative scheme that has:
Given states incentives to spend billions of dollars in public funds—funds that had previously been used to promote economic security and opportunity for low-income parents—in an unaccountable and often irresponsible manner. In fact, the bulk of public funds available to states under TANF today are not spent on child care, employment services, or helping families meet basic needs. Instead, states have diverted the bulk of funds to "other services." In the most notorious cases, states have diverted TANF funds to finance unaffordable tax cuts.
Failed to provide adequate information on how states use TANF funds. As GAO has found, we know very little about how states are using 70 percent of the funds provided by the $16.5 billion block grant, including not just "results" but things as basic as how many families are being helped with these funds.
Deeply cut the actual amount of resources available through AFDC/TANF to help struggling, working-class families. Because block grant funding has remained frozen at its 1997 level, its real value (adjusted for inflation) has fallen by nearly 30 percent, as the quick calculation below shows.
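That near-30-percent erosion is easy to verify: deflate the frozen block grant by the growth in consumer prices since 1997. A back-of-the-envelope sketch (the CPI-U annual averages used here are approximate):

```python
# Real value of a nominal grant frozen at its 1997 level.
# CPI-U annual averages (approximate): 1997 ~ 160.5, 2012 ~ 229.6
CPI_1997 = 160.5
CPI_2012 = 229.6

block_grant_nominal = 16.5  # billions of dollars, frozen since 1997

real_value_in_1997_dollars = block_grant_nominal * CPI_1997 / CPI_2012
decline = 1 - real_value_in_1997_dollars / block_grant_nominal

print(f"real value: ${real_value_in_1997_dollars:.1f}B in 1997 dollars")  # ~$11.5B
print(f"decline:    {decline:.0%}")  # roughly 30 percent
```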
Of course the HAMP and various derivative programs were complicated. They were made even more so by the fact that mortgage servicers really weren't set up to do modifications. The servicer process was set up to minimize costs by establishing standardized procedures. These outfits knew how to collect monthly payments, send out late notices, start default proceedings, and carry through foreclosures. They had specified payments for each step in this process. Working out loan modifications was not on the list, so it's not surprising that they didn't do a very good job, especially when we factor in complications like second mortgages.
There was an easier route. Early on in the crisis (August of 2007), I proposed the first version of my Right to Rent plan. The basic point was very simple. It gave homeowners the option to stay in their home as renters for a designated period of time following foreclosure (at least five years, in my ideal world), paying the market rent.
This would accomplish several important goals. First and foremost it would instantly give homeowners facing foreclosure a substantial degree of housing security. If they had kids in school, the rental period would likely be long enough to let them finish. It would also give them some time to get on their feet financially and make arrangements for appropriate housing.
Second, it would prevent a blight of foreclosures from ruining whole neighborhoods. Renters who effectively have long-term leases have almost as much incentive to maintain the property as owners. At the least they would keep homes occupied so that they would not be eyesores and possible havens for crime.
Third, this right would give banks more incentive to find ways to modify mortgages to keep people in their homes as owners. Banks would generally prefer having a foreclosed house free and clear rather than being a landlord for 5 years. By making the foreclosure option less attractive, Right to Rent would make banks more willing to consider alternatives.
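One way to see the incentive shift is a crude present-value comparison from the bank's side; the sketch below uses entirely hypothetical numbers and a simple 5 percent discount rate:

```python
# Crude expected-value comparison for a lender. All dollar figures
# and rates are hypothetical, chosen only to illustrate the direction
# of the incentive.

def present_value(amount: float, years: int, rate: float = 0.05) -> float:
    """Discount a future cash flow back to today at `rate`."""
    return amount / (1 + rate) ** years

home_sale_price = 150_000  # net proceeds from selling the foreclosed home
annual_rent = 12_000       # market rent the former owner would pay
landlord_costs = 6_000     # annual maintenance, management, taxes
rental_years = 5

# Option A: foreclose and sell the house immediately
sell_now = home_sale_price

# Option B: foreclose, but the occupant exercises Right to Rent;
# the bank nets rent minus landlord costs for five years, then sells
net_rent = annual_rent - landlord_costs
rent_then_sell = (
    sum(present_value(net_rent, t) for t in range(1, rental_years + 1))
    + present_value(home_sale_price, rental_years)
)

print(f"sell now:       ${sell_now:,.0f}")        # $150,000
print(f"rent then sell: ${rent_then_sell:,.0f}")  # roughly $143,500
# With Right to Rent on the books, the less attractive Option B is what
# foreclosure actually delivers, so a loan modification that keeps the
# owner paying can start to look like the bank's better deal.
```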
Right to Rent would not have required any massive bureaucracy. It could have been handled within the same legal framework in which foreclosures were handled. The main cost would be the payment for appraisers who would determine the market rent on a foreclosed home. This is the same process that appraisers follow when determining the market price of a home for someone seeking a mortgage.
Right to Rent also raised few of the moral hazard or political issues associated with various plans to have the government pay down debt. The former homeowner hardly gets a bonanza in this story. They get a right to stay in their home that they would not have otherwise had, in recognition of the extraordinary circumstances homeowners faced in the peak years of the housing bubble.
I was impressed that many moderate conservatives, such as former Bush administration economist Andrew Samwick and Desmond Lachman, an economist at the American Enterprise Institute, considered Right to Rent a reasonable solution to the housing crisis. Even Fox business anchor Neil Cavuto sent me a note saying that he agreed with the idea.
Of course, President Obama would have had to get Congress to pass such a measure. Would they have done it in early 2009? Who knows? There was a huge amount of goodwill directed toward the new president, who had sky-high approval ratings at the time. Certainly it would have been difficult to mount a Santelli-type rant-fest against Right to Rent. After all, there were no taxpayer dollars involved.
In this area, as in so many others, the Obama administration was unwilling to really push anything new. Perhaps Right to Rent wouldn't have worked (we can still try it), but it certainly could not have done worse than the alternatives.