


The Bare Minimum: Labor Standards and American Inequality

American inequality is rooted in wage inequality—and in the policies that shape the labor market. The minimum wage, in this respect, is an important and foundational policy, especially as other mechanisms for sustaining bargaining power or bidding up wages have withered. Labor standards set both an economic and ethical floor. But, on both scores, the current minimum wage offers a pretty shaky floor. Its coverage is uneven. Enforcement of wage and hour standards in the United States is weak (and weakening). And the value of the American minimum wage—in historical and international terms—is meager.


A Short History of the Minimum Wage

Early efforts to establish minimum wages in the United States were part of a broader response to industrialization, urbanization, and mass immigration in the first decades of the twentieth century. Progressive reformers were broadly concerned with compensation and working conditions, but also hoped that higher standards would protect white male breadwinners from the “wearing competition” of those (women, children, immigrants, African Americans) who “unfairly dragged down the wages of more deserving workers.”  

In 1912, Massachusetts led the way with a law that calibrated the minimum to a basic standard of living, with allowances for the competitive conditions of particular occupations and industries. The next year, eight other states (Wisconsin, Minnesota, Washington, Oregon, California, Colorado, Nebraska, and Utah) followed suit—some on the Massachusetts model, others introducing new provisions or enforcement standards. But, in all of these settings, the law applied only to women and minors—reflecting both the goal of easing such workers out of the labor market, and the assumption that men did not require such paternal protection.

The reach of these laws, furthermore, was sharply curtailed by the courts. In Lochner v. New York (1905), the Supreme Court struck down a New York maximum hours law on the grounds that it “restricted the worker’s right to set the price for his own labor.” In Hammer v. Dagenhart (1918), the Court chilled federal action, ruling that the commerce clause did not permit federal regulation of child labor in the states. And in Adkins v. Children’s Hospital (1923), the Court reaffirmed the “freedom of contract” implied by Lochner, invalidating a District of Columbia law that set minimum wages for women.

It was not until the onset of the Great Depression that national politics again tested these constraints. The New Deal began experimenting with wage regulation under the President’s Reemployment Agreement (1933), which asked employers to sign pledges, eventually covering some 16 million workers, that shortened the work week and set a minimum wage of forty cents/hour (in inflation-adjusted terms, pretty close to the current minimum of $7.25). This was followed, and formalized, by the National Industrial Recovery Act (NIRA), whose “codes of fair competition” allowed the federal government to set minimum wages and maximum hours by industry.

The Supreme Court remained skeptical. In 1935, it struck down the NIRA (echoing Dagenhart) as a violation of the commerce clause. A year later, in Morehead v. New York, the Court harkened back to Lochner when it invalidated a state minimum wage law as an affront to the freedom of contract. In this tenuous legal climate, the New Deal moved cautiously. The Walsh-Healey Act (1936) set “prevailing wage” labor standards for all government contracts, extending the reach of the 1931 Davis-Bacon Act, which did the same for public works. In his 1936 campaign, Roosevelt pledged to push for more general wage and labor standards. And in 1937, the Court reversed course—ruling, in West Coast Hotel Company v. Parrish, that freedom of contract could be regulated or restrained when larger social goals were at stake or when the standing of the parties to the contract was starkly unequal.

This set the stage for the passage of the Fair Labor Standards Act (FLSA) of 1938, which banned most child labor, established a maximum workweek of forty-four hours, and set a minimum hourly wage of twenty-five cents. The wage law was seen both as a recovery measure and, coming on the heels of the National Labor Relations Act (also upheld by the Supreme Court in 1937), as an essential complement to the protection of labor’s rights. The minimum wage, argued the Administration, would “underpin the whole wage structure. . . . [to] a point from which collective bargaining could take over.”

But if the courts had conceded the justice and logic of basic labor standards, the South dug in its heels. Leery of the implications of universal or national standards, southerners in Congress won sweeping exemptions: the FLSA would not cover agricultural workers, domestic workers, or employees in retail or service firms doing less than half of their business in interstate commerce. At its passage, the FLSA reached only about a fifth of the labor force. And the task of setting the minimum wage, vested in professional wage commissions in most other countries, was left with Congress. The wage would slide, as inflation undercut its purchasing power, until Congress deigned to add another dime or quarter to the base rate.

The limits of the FLSA were apparent after 1938, as coverage actually narrowed (with new exemptions for laundries, cleaning, and tailoring in 1945) and the wage itself inched up, to seventy-five cents/hour in 1949, and to one dollar/hour (about $8.50 in 2013 dollars) in 1956. Not until the 1960s did Congress revisit the exempted occupations—adding coverage for large retailers and all interstate employers in 1961, and rolling in some agricultural and some public workers in 1966.

After the late 1960s, legislative battles were increasingly colored by the powerful (if false) assumption that any increase in the minimum wage’s rate or coverage came at the expense of employers and displaced low-wage workers. Periodic increases were made necessary by persistent inflation, but still did not keep pace. Nixon vetoed an effort to expand coverage in 1973, but relented to Congressional pressure a year later—finally extending the FLSA’s reach to local and state public employees. Looking to slash business costs in any way he could, Reagan declared the minimum wage “public enemy number one” and allowed a full decade to pass between increases—during which time the minimum lost over a quarter of its value in real dollars. For its part, the Clinton administration pushed through two increases, but only by giving in on the tax side: “Since we know that a minimum wage increase kills jobs,” reasoned then–House Speaker Newt Gingrich, “there ought to be a package that includes other things that create more jobs to make up.”

Both the establishment of the federal minimum late in the New Deal, and every subsequent legislative battle over increasing it, have been marked by grave concerns about interfering with markets or freedom of contract, and by dire predictions that each increase would drive business into bankruptcy and workers into the soup lines. But a raft of recent research has put to rest the old saw that a higher minimum wage kills jobs or repels investment. Because a majority of minimum-wage earners work in service jobs that cannot be moved offshore, employers have little room to respond to a higher wage floor by relocating the work. Most low-wage workers, in turn, work not for vulnerable small businesses, but for large—and in recent years quite profitable—corporations. And the ancillary benefits of raising the floor—including boosts to productivity and purchasing power—are more substantial and lasting than any direct employment effects.

Labor Standards and American Inequality


The minimum wage, part of a bundle of New Deal policies that sustained the bargaining power of workers, dampens inequality by maintaining a floor under wages. But its effectiveness depends on its real value, its reach, and its enforcement. Most importantly, the real value of the minimum has declined dramatically over the last generation [see graphic below]. In the first thirty years after the passage of the FLSA, Congress raised the minimum wage eight times, pushing it over $9.00/hour (in 2013 dollars) in 1969. Since then, the dollar value of the federal minimum wage has been bumped another fourteen times, but has failed to keep pace with inflation—and fallen substantially in real value.
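The inflation arithmetic behind these real-value comparisons is easy to check. Below is a minimal sketch in Python; the CPI-U values are approximate annual averages used only for illustration, since the deflator behind the chapter's figures is not specified, and different index choices shift the results.

```python
# A minimal sketch of the inflation adjustment behind comparisons like
# "over $9.00/hour (in 2013 dollars) in 1969." The CPI-U values below are
# approximate annual averages (1982-84 = 100); treat outputs as illustrative.

CPI_U = {
    1938: 14.1,   # year the FLSA set the minimum at $0.25
    1968: 34.8,   # the minimum's real-value peak era
    2013: 233.0,
}

def real_value(nominal_wage, year, base_year=2013):
    """Express a nominal wage from `year` in `base_year` dollars."""
    return nominal_wage * CPI_U[base_year] / CPI_U[year]

print(f"$0.25 in 1938 -> ${real_value(0.25, 1938):.2f} in 2013 dollars")
print(f"$1.60 in 1968 -> ${real_value(1.60, 1968):.2f} in 2013 dollars")
# CPI-U puts the late-1960s peak near $10.70; research-series deflators
# put it closer to the $9.00-and-change figure cited in the text.
```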

The minimum’s falling value is perhaps best underscored by benchmarking it against other economic measures and trends. Only at its peak in the late 1960s was the minimum wage sufficient to lift a family of three, with one full-time worker, above the poverty line. The supplemental poverty measure, an alternative that corrects some of the gaps in the older poverty measurement, puts minimum wage workers and their families even further behind. And a living wage threshold (accounting for the actual cost of living across the country) would suggest a minimum wage three or four times the current level. (In Des Moines, Iowa, the living wage for a family of three is $26.96/hour.) If we calibrate the minimum wage to productivity, or to the growth of the economy, it would be between $15.00 and $22.00 today—depending on how productivity is calculated or indexed.
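The poverty-line benchmark is simple arithmetic: a full-time, year-round worker puts in about 2,080 hours, so the wage needed to clear a given threshold is that threshold divided by 2,080. A short sketch, using the 2013 federal poverty guideline for a family of three ($19,530) as the threshold; the chapter's own benchmark may rest on the Census Bureau's thresholds, which differ slightly.

```python
# Poverty-line benchmark: the hourly wage one full-time, year-round earner
# needs to lift a family of three over the line. Uses the 2013 HHS poverty
# guideline ($19,530 for three); Census thresholds differ slightly.

FULL_TIME_HOURS = 40 * 52   # 2,080 hours per year

def wage_to_clear(threshold, hours=FULL_TIME_HOURS):
    """Hourly wage at which `hours` of work just reaches `threshold`."""
    return threshold / hours

needed = wage_to_clear(19_530)
print(f"wage needed: ${needed:.2f}/hour (minimum wage: $7.25/hour)")
print(f"annual shortfall at $7.25: ${(needed - 7.25) * FULL_TIME_HOURS:,.0f}")
```

At $7.25, a full year of full-time work leaves such a family several thousand dollars short of the line, which is the gap the text describes.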

The takeaway from all of this is simple: Even by the most meager benchmarks, the minimum wage should be at least $10.00/hour (in the realm of recent proposals by Obama and Congressional Democrats). Benchmarks that actually sustain the value of the minimum or tie it to economic growth over time come in at close to twice that. And those that tie the minimum wage to the actual cost of living for working families run three times that or more. Even as low-wage workers are more productive, more experienced, and better educated than ever, their share of economic rewards is shrinking.

The lapse in value is compounded by gaps in coverage. Large groups of low-wage workers (including domestic workers, home care workers, farmworkers, and seasonal workers) were exempted from coverage under the FLSA in 1938—a pattern of exclusion animated both by the deeply gendered nature of social citizenship in the United States and by the “Jim Crow” concessions to local (southern) power that marked the social policy innovations of the 1930s and 1940s. And while some of those holes have been patched (most notably with the recent extension of FLSA coverage to many home care workers, and the proposal to extend overtime coverage to many low-wage salaried workers), the protections afforded by the FLSA are still ragged where they are needed most.

The weakness of American wage and hour standards is compounded by meager enforcement of even the most basic provisions of state and federal law. Recent work on the prevalence of wage theft has documented a wide range of FLSA violations—including piece or day labor rates that fall short of the minimum, failure to pay overtime, illegal or excessive deductions from paychecks, willful misclassification of workers as “independent contractors,” and various petty gambits for stealing time (such as unpaid breaks or forcing workers to don safety gear off the clock). In short, federal labor standards are too low, they don’t reach enough workers, and they are often flouted where they do apply.

So what difference would it make if the minimum wage were higher, broader in its coverage, and more rigorously enforced? The ability of a higher minimum to dampen inequality is now well established. Research on this question has traced the impact of minimum wage increases across jurisdictions (looking at employment and income patterns in neighboring states, cities, or counties with different minimum wage rates), over time (looking to employment and income patterns in the years after an increase), and across the demographics of those workers earning at or near the minimum.

In the United States, the minimum wage held up the lower “tail” of the earnings distribution into the 1970s; as its real value plummeted thereafter, so too did the fortunes of low-wage workers. The falling minimum widened the gap between low-wage and median-wage workers through the 1970s and 1980s, accounting for about a third of the growing gap between low and median wages for women, and about a fifth of the same gap for men.


Looking at family income patterns after state or federal increases in the minimum rate (from 1990 to 2012), Arindrajit Dube finds each ten-percent increase (from $7.25 to $8.00, for example) yielding about a 3.5 percent decline in the federal poverty rate. If we project the impact of an increase to $10.10/hour across all workers currently earning less than that, we find that seventy percent of the benefit would go to families earning less than $60,000/year and nearly a quarter of the benefit would go to those earning less than $20,000.
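Read as an elasticity, Dube's estimate implies that the poverty rate falls about 0.35 percent for every 1 percent increase in the minimum. The sketch below extrapolates that estimate to the proposed jump from $7.25 to $10.10. The extrapolation is rough: an elasticity estimated on ten-percent changes is not guaranteed to hold over a 39 percent one, and the 15.0 percent baseline (the official 2012 poverty rate) is used here only for scale.

```python
# Rough extrapolation of the elasticity cited in the text: each 10% rise
# in the minimum wage ~ a 3.5% decline in the poverty rate (an elasticity
# of about -0.35). Applying a point estimate to a 39% jump overstates its
# precision; this is an illustration, not a forecast.

ELASTICITY = -0.35

def projected_rate(baseline_rate, old_min, new_min):
    """Apply a constant-elasticity response to a minimum wage change."""
    pct_change = (new_min - old_min) / old_min
    return baseline_rate * (1 + ELASTICITY * pct_change)

# 15.0% was the official U.S. poverty rate in 2012, used here for scale.
print(f"{projected_rate(15.0, 7.25, 10.10):.1f}%")   # about 12.9%
```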

The workers who would gain from an increase scarcely resemble the suburban teenage hamburger-flippers so often invoked by conservatives. The graphic below summarizes the distribution (across various measures) of all hourly workers and of hourly workers paid at or below the minimum wage in 2012. Comparing the two universes of workers underscores those demographics, occupations, and industries in which low-wage workers are overrepresented.



Unsurprisingly, a lot of minimum wage workers are young. But over three-quarters of those working at or below the minimum are not teenagers. Indeed, almost 90 percent of those who would benefit from an increase (a group that includes those working below, at, or near the minimum) are at least twenty years old. Almost 45 percent of minimum or sub-minimum-wage workers have better than a high school education. And about 42 percent work more than thirty hours a week. Raising the minimum wage would have a sizable impact in those corners of the economy (fast food, retail) in which low-wage, no-benefit employment is now the dominant business model. The leisure and hospitality industry accounts for just 13 percent of hourly employment but over half of all minimum wage employment. Food preparation and service account for less than 10 percent of hourly employment, but nearly 44 percent of workers earning at or below the minimum.

For women, overrepresented in both low-wage occupations and exempted occupations, the minimum wage is especially important. Adult women are roughly twice as likely as men to be paid at the minimum, and make up the single biggest cohort of minimum wage workers. Women workers lost the most ground as the minimum wage slipped in value—and they would gain the most from any increase.

American Labor Standards in International Perspective


All of this puts the United States well behind its international peers. In an era when most other industrial democracies were forging ahead with minimum wage laws of broad coverage, American efforts were confined to a few states, aimed at a few workers, and routinely disdained by the courts. While professional wage commissions set minimum wage rates in most other comparable countries, FLSA coverage and rates remain subject to the whims of Congress. As a result, the American minimum wage hits a lower target and covers a smaller share of the workforce than those of most of our peer countries.

The graphic below plots national minimum rates across the OECD, both as a share of each country’s median and average wage and in real U.S. dollars. As a share of median or average wages, the U.S. minimum trails the pack—well behind our richer peers, and most of the poorer cousins in the OECD as well. At the U.S. exchange rate, the U.S. minimum trails all the other rich countries; and at U.S. purchasing power (a more stable measure) it trails all but Japan in that group.
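The standard yardstick in these comparisons is the ratio of the minimum wage to the median wage (sometimes called the Kaitz index). A minimal sketch follows; the wage figures in it are illustrative round numbers chosen to approximate the published OECD ratios, not the data behind the graphic.

```python
# Minimum-to-median ratio (the "Kaitz index") used in cross-national
# comparisons. The hourly figures below are illustrative round numbers,
# not the OECD series behind the graphic.

def kaitz(minimum, median):
    """Minimum wage as a share of the median wage."""
    return minimum / median

examples = {
    "United States": (7.25, 19.00),   # approximate 2013 hourly figures
    "France":        (10.60, 17.00),  # SMIC converted at PPP, roughly
}
for country, (mn, md) in examples.items():
    print(f"{country}: minimum is {kaitz(mn, md):.0%} of the median wage")
```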


The most damaging difference, in this respect, is the yawning gap between the U.S. minimum and the threshold (two-thirds of the median wage) for low-wage work. In most European settings, the minimum wage (set by national law or collective bargaining) runs close to that threshold and—as a result—actually has an impact on the incidence of low-wage work. In the United States, the minimum wage is now too low even to play this role. More broadly, the U.S. lags well behind its peers on a wide array of labor market policies—a fact evident both in the broad sweep of our history and in the response to the Great Recession.
