2015 Outstanding Academic Title, Choice
From Shared Prosperity to the Age of Insecurity
How We Got Here
I have pointed out to the Congress that we are seeking to find the way once more to well-known, long-established, but to some degree forgotten ideals and values. We seek the security of the men, women, and children of the nation. That security involves . . . [using] the agencies of government to assist in the establishment of means to provide sound and adequate protection against the vicissitudes of modern life: in other words, social insurance.
Franklin Delano Roosevelt, fireside chat, June 28, 1934
Many of our most fundamental systems (the tax code, health coverage, pension plans, worker training) were created for the world of yesterday, not tomorrow. We will transform these systems so that all citizens are equipped, prepared, and thus truly free to make your own choices and pursue your own dream. Another priority for a new term is to build an ownership society, because ownership brings security and dignity and independence. In all these proposals, we seek to provide not just a government program but a path, a path to greater opportunity, more freedom, and more control over your own life.
George W. Bush, acceptance speech at the Republican National Convention, September 2, 2004
Tales of families, particularly middle-class and working-class families, experiencing upheavals and setbacks because of job losses, health care emergencies, and stagnating wages have become increasingly common. Economic uncertainty has always existed, of course, but the breadth and depth of the problem in twenty-first-century America are alarming. In fact, until relatively recently many people assumed that secure jobs, rising incomes, and upward mobility were an inherent part of American society. Times have changed.
Age of Security: The New Deal Era
In the broadest historical perspective, today's widespread sense of economic uncertainty is not so much a new phenomenon as a regression to an older state many Americans believed had been safely left behind. Before the Great Depression most jobs in America were precarious: wages were unstable, pensions and health insurance were unheard-of, and labor laws were almost nonexistent. However, in the wake of that economic calamity, a new social ethic emerged that sought to provide Americans with greater security in both good times and bad.
Beginning in the 1930s, the federal government took a more active and formal role in protecting Americans from the "hazards and vicissitudes of life" in modern capitalism. Throughout that decade laws were enacted to govern working hours and establish minimum wage levels. And with the passage of the Social Security Act and the Wagner Act in 1935, Americans gained access to old-age and unemployment insurance and the right to counteract the power of employers through collective bargaining. These kinds of laws dramatically expanded the number of workers with secure jobs, living wages, and robust benefits.
Through large-scale social insurance programs like those grouped under the rubrics of the New Deal (in the 1930s) and the Great Society (in the 1960s), presidents from Franklin D. Roosevelt through Gerald Ford sought to shelter Americans from economic ups and downs by expanding programs like unemployment insurance and disability benefits and by providing health insurance through Medicare and Medicaid to the elderly and the poor. Many private employers, spurred on by a powerful labor movement, also came to embrace the collectivist approach put in motion by the New Deal; they offered workers good wages and health, disability, and pension benefits as a way of rewarding them for their hard work (and as a way of fending off more intrusive government intervention in the private labor market).
Over time, an unspoken agreement was struck among government, labor, and big business that shaped a hybrid public/private system for providing security and prosperity to tens of millions of Americans. From the 1930s through the 1970s, the government took responsibility for tempering the effects of the business cycle through economic policies based on the theories of the twentieth-century British economist John Maynard Keynes, coordinating national monetary policy and fiscal policies to minimize the depth and duration of recessions. It also helped train young people for jobs (through public support for colleges and universities, subsidized student loans, and scholarship programs) and helped stimulate mass consumption by supplying subsidies for housing and funds to develop a national highway system. Yet the government was careful not to go too far, abstaining from centralized planning or meddling in corporate decision making. For their part, workers tacitly agreed to minimize labor unrest in return for promises by companies to provide stable employment, good benefits, and wage increases that reflected workers' fair share of the profits gained through rising productivity.
Though this three-way social contract among government, employers, and workers varied in its effectiveness and excluded many women and minorities, it nevertheless enabled several generations of Americans to prosper. Countless statistics tell the story. Here are just a few. From 1950 to 1970, the yearly income of the median worker more than doubled, and those at the bottom of the earnings distribution saw their earnings increase even more. Family income increased by 56 percent between World War II and the mid-1960s. There was an upgrading of the entire employment structure in the 1960s, with strong employment growth in middle- and high-wage jobs and only modest expansion in low-wage jobs. From the mid-1940s to the late 1960s, America became a more equal society as family income inequality decreased by 7.5 percent. Pension and health care coverage were on the rise. At the peak of this system, in the late 1970s, private pensions covered 40 million people (49 percent of private wage and salary workers), while private health care coverage reached more than 80 percent of Americans. Affordable housing became more available, and home ownership almost doubled, growing from seventeen million to thirty-three million. Thanks to government initiatives like the G.I. Bill (1944) and the Higher Education Act (1965), 2.3 million veterans went to college, and the number of low-income students attending universities nearly doubled between 1965 and 1971.
Collectively, these forces helped to create a large and thriving middle class whose growing wealth stimulated decades of unprecedented economic expansion. President John F. Kennedy's words "A rising tide lifts all the boats" summarized the experience of hundreds of millions of Americans, whose gains during the New Deal and after World War II made economic optimism seem inevitable and permanent.
From the 1970s to Today: Stagnation, Inequality, Insecurity
At first almost imperceptibly, then with growing force, the economic tides began to shift away from growth, security, and shared prosperity for Americans in the 1970s. Lurking behind the bad news were potent transformations that would gradually alter the dream of universal progress that Americans had come to consider their birthright.
Once again, the numbers map the trends. In the early 1970s, median earnings began to stall; by the 1990s, a considerable number of workers were earning less than their counterparts had decades earlier. By 1996, real wages for the workers at the bottom had fallen about 13 percent, and real wages for workers in the middle had fallen by close to 10 percent.
Men, especially those with less education, have been particularly affected. Between 1969 and 2009, men's median annual earnings decreased by 14 percent. Among men with only a high school diploma, median annual earnings declined even more, falling by 47 percent over the same period. Other troubling developments have been the decline in the number of men working full-time and the rise in the number of men with no formal labor-market earnings at all. Between 1960 and 2009 the share of men working full-time decreased from 83 percent to 66 percent, and the share of men with no formal labor-market earnings increased from 6 percent to 18 percent.
Women have fared better. Spurred on by the women's movement, a desire to work, and the financial needs of their families, women have poured into the labor force over the last fifty years. Given how low women's wages were several decades ago and how many women are now working, women's earnings have dramatically increased, rising 56 percent (for the median full-time female worker) since the early 1960s. However, women's wages, too, have recently plateaued. Since 2001, median earnings for women have mostly stagnated.
When we look at individual male and female workers over time, the numbers are somewhat bleak. From a household perspective, however, economic well-being looks a bit better. Between 1975 and 2009, median earnings for two-parent families increased by 23 percent. However, this increase has been fueled by parents, mostly mothers, working more hours, not by rising wages. On average, the typical two-parent family now works 26 percent longer, or seven hundred more hours per year, than the typical two-parent family did in 1975. If women had stayed home, middle-class incomes would have grown by only about a quarter as much as they did between 1979 and 2000, while low-income families would have seen a significant decrease in real income. The consecutive recessions of 2001 and late 2007/2008 have put downward pressure on family household incomes. Since 2007, median family household income has declined by 8.4 percent. In real terms, median family household income has returned to 1996 levels.
Another troubling sign that began to emerge in the 1970s is the failure of most families to realize economic gains from rising productivity. Although median family income and productivity grew in tandem at 2 to 3 percent a year in the immediate post-World War II years, from 1973 to 2005 median family income grew at less than one-third the rate of increases in productivity. Thus even though American workers are producing more goods and services per hour, they have not been rewarded for it. Instead, most of the gains from increased productivity have gone to executive compensation and corporate profits.
With the top receiving most of the gains in both income and wealth, economic inequality has risen drastically. During the 1980s (the worst decade by this measure), workers earning the least saw their wages decline by 14 percent, workers earning the most saw their wages increase by 8 percent, and the wages of workers in the middle remained flat, reflecting a widening income gap from the top to the bottom of the scale.
Since the 1990s, a different pattern has emerged, with inequality growing between the highest earners and those in the middle while narrowing between earners in the middle and those at the bottom. Inequality has thus grown because income growth among the highest-earning Americans has outstripped that of both the middle class and the poor. From 1976 to 2005, for example, the growth of post-tax income among the poorest households was just 6 percent, while among middle-income households it was 21 percent (less than 1 percent a year) and among the top fifth of households it was 80 percent.
Furthermore, recent evidence finds that the growth in wage inequality is becoming more concentrated at the top. The wage gap between those with graduate degrees and those with college degrees only has grown more than the gap between those with college degrees and those with only a high school diploma.
Wealth inequality has grown even faster than income inequality. In the beginning of the 1960s, the wealthiest fifth of all U.S. households held 125 times more wealth than the median wealth holder. By 2004, the ratio had increased to 190 to 1. From 1984 to 2004, the top 20 percent of households received 89 percent of the total growth in wealth, while the bottom 80 percent received just 11 percent. By 2010, median wealth in the United States had reached its lowest point since 1969. In 2010, the top fifth of households held almost 90 percent of all wealth, households in the middle held 12 percent, and households at the bottom had a negative net worth: they owed 0.9 percent of all wealth.
Taken together, the growth in both income inequality and wealth inequality since the postwar years has led some to conclude that the United States has not seen such high levels of inequality since just before the Great Depression. Among all the industrialized countries in the world, the United States is now one of the most unequal.
Other developments loom darkly over the economic status of American families, starting with work insecurity. Studies have found a decrease in employment stability, especially for men. For example, the average job tenure for men working in the private sector at age fifty has declined from 13.5 years in the 1973-83 period to 11.3 years in the 1996-2008 period. Job losses often result in long-lasting economic setbacks. Research shows that today's displaced workers can suffer from prolonged periods of unemployment and that once they find new jobs, their earnings are often substantially lower than their previous earnings. A 2013 survey of more than a thousand American workers found that among those who had gotten a job after being laid off during the recession, 54 percent earned less in their new jobs.
Other research has found that economic volatility has risen even faster than inequality. By 2003, the rate of income instability (as measured by drops in income) was three times greater than in the early 1970s. Indeed, the share of Americans experiencing economic difficulties without sufficient financial resources to weather the storm has steadily increased, from 14.3 percent in 1986, to 18.8 percent in the early 2000s, to 20.5 percent during the Great Recession. Furthermore, when family income drops, it now drops much further than it did in the past. In the early 1970s, the usual loss was around 25 percent of a family's previous income; by the late 1990s the loss had grown to 40 percent. A report released by the Rockefeller Foundation estimated that in 2009 the level of economic insecurity was greater than at any other time in the last twenty-five years, with about one in five Americans experiencing a decrease in household income of 25 percent or more. With only about half of Americans having savings to cover living expenses for three months, the increase in economic volatility puts many families on the edge of insolvency.
Other signs of economic insecurity have risen as well. The number of filings for personal bankruptcy grew from around 300,000 in 1981 to about 1.5 million in 2004 and 2 million in 2005. (The latter figure reflected a rush of people filing before a new bankruptcy law went into effect that made filing for chapter 7 bankruptcy, which wipes away most debts, more restrictive and more expensive.) Recently there has been another uptick in the number of bankruptcy filings (more than 1.5 million in 2010 and about 1.2 million in 2012, up from 775,000 in 2007). And experts believe that even these numbers are artificially low, since many who would like to file cannot afford the legal costs involved.
The housing market reflects similar trends. Foreclosure rates tripled from the early 1970s to the early 2000s and have skyrocketed in recent years because of the subprime mortgage crisis. From 2007 to 2011 there were more than 4 million foreclosures nationwide. A 2012 report from the Center for Responsible Lending noted that an average of five hundred families in California have lost their homes every day since the beginning of the Great Recession.
Finally, levels of indebtedness have reached record highs in the United States. In 2004, debt held by households was equal to 80 percent of GDP, up from 50 percent in 1980. For almost every group of households in the United States, the ratio of mortgage debt to income has doubled since 1989, and for many the ratio of total debt to income has also more than doubled. The median value of debt held by American families increased sharply between 1989 and 2007, rising from $24,000 to $67,300. The percentage of American households reporting debt payments that exceed 40 percent of their income grew from 10 percent in 1989 to about 14 percent in 2010. In 2007, 46 percent of American families carried a balance on their credit card, with the average balance having risen 30 percent since 2004, to $7,300. By 2010, as access to credit tightened, fewer families carried credit card debt (39 percent), and the average balance had fallen slightly, to $7,100.
These higher levels of debt are linked with rising costs. Housing leads the way. Karen Dynan, a researcher with the Brookings Institution, argues that much of the increase in aggregate household debt, from about 0.6 times personal income during the 1960s through the mid-1980s to close to 1.2 times personal income in recent years, can be linked to bigger mortgages taken out in response to rising home prices. Since 1975, the proportion of middle-class Americans who might be considered "house poor" because they spend more than 40 percent of their earnings on housing has quadrupled. Many families seeking to send their children to good schools have had to spend a larger proportion of their income to afford increasingly expensive homes in coveted public school districts.
Health care costs are on the rise, too. Between 2000 and 2007, the average annual premium for job-based family health coverage rose more than 90 percent (from $6,351 to $12,106). The average worker's share of this family premium rose from $1,656 to $3,281, an increase of more than 98 percent. By 2012, average annual family health insurance premiums rose to $15,745, and families were responsible for 28 percent of the cost, or about $4,409.
The cost of education continues to soar. In the late 1970s, the average tuition cost for a state college was just over $1,900; by 2007-2008 it had risen to $6,185 (in constant dollars). By 2008, about two-thirds of college graduates had taken on debt to finance their educations. Upon graduating, they owed on average about $20,000 if they attended a public university and close to $28,000 if they attended a nonprofit private university. In 2010, almost one in five U.S. households held student loan debt, a share that has more than doubled since 1989.
In addition to the high costs of housing, health care, and education, families with young children also face large child care bills. In almost half the states in this country, the cost of sending a four-year-old to day care exceeds 10 percent of the median income for a two-parent family. In 2011, the average annual cost for an infant to attend a center-based child care program exceeded a year's tuition and fees at public universities in thirty-five states.
Similar statistics could be cited for several more pages, but the pattern is clear, and very troubling. From the 1970s to today, income stagnation, growing inequality, increasing economic instability, soaring debt, and rising costs have steadily eroded the well-being of American families.
Behind the Reversal of Fortune
It is clear that American families have been struggling in recent decades. Less obvious are the forces that are responsible for this reversal of fortune. However, a significant body of research now points to a confluence of economic and social trends that many scholars agree have played a crucial role in the rise of family insecurity.
The Rise of the Service Economy
Since the 1970s, work in the United States has undergone a dramatic transformation: a regression from the New Deal quest for stability and security to a state in which work is precarious. In the words of sociologist Arne L. Kalleberg, work has become more "uncertain, unpredictable, and risky from the point of view of the worker."
One reason for the rise of precarious work is the wholesale restructuring of the American economy from one based on manufacturing to one based on services. After World War II the manufacturing sector accounted for 40 percent of the labor force; by 2005, that share had fallen to only 12 percent. The service sector now accounts for about 80 percent of the jobs in the United States. Durable manufacturing jobs (autoworker, machinist, chemical engineer) offering higher wages and good benefits have been replaced by service sector jobs (store clerk, cashier, home health care aide) that pay less, offer few or no benefits, and are more insecure.
Moreover, while the manufacturing sector tends to create good jobs at every employment level, the service sector tends to create a relatively small number of high-skill, high-paying jobs (in fields like finance, consulting, and medicine) along with a large number of low-skill, low-paid jobs (in retailing, child care, and hospitality). The result is that secure, semiskilled middle-income jobs like those that once fueled the rapid expansion of the American middle class are increasingly hard to find.
The Impact of Globalization
Beginning in the mid- to late 1970s, U.S. firms began to face dramatically increased competition from around the world. To compete, American companies sought to lower labor costs, in part by outsourcing work to lower-wage countries. Technological advances aided this outsourcing process, as the growth in electronic tools for communication and information management meant that goods, services, and people could be coordinated and controlled from anywhere around the globe, enabling businesses to more easily move their operations to exploit cheap labor sources abroad.
Perhaps the most far-reaching effect of globalization has been a renegotiation of the unwritten social contract between American employers and employees. Managers now demand greater flexibility to adapt quickly and survive in what they see as an increasingly competitive global marketplace. In this context, the traditional employment relationship has come to seem unrealistic and onerous to business leaders. Under that arrangement, work was steady and full-time; workers were rarely fired except for incompetence; working conditions were generally predictable and fair, often defined by union-negotiated contracts; and good employees could expect to climb a lifetime career ladder in the service of one employer. Today that traditional arrangement has largely disappeared, replaced by nonstandard, part-time, contract, and contingent work, generally offering reduced wages and scanty benefits. Mass layoffs are no longer an option of last resort but rather a key restructuring strategy used to increase short-term profits by reducing labor costs in both good times and bad.
The Decline of Unions
In this new environment, unions are struggling. Although manufacturing workers have a long history of labor organizing, service economy workers such as restaurant and retail employees do not, making it harder for service employee unions to grow. Moreover, globalization, technological changes, and the spread of flexible work arrangements have combined to enable employers to make an end run around unions by moving jobs to countries or parts of the United States where antiunion attitudes and legal regimes predominate. As a consequence of these developments, union membership has steadily declined. In 1954, at the peak of union membership, 28 percent of employed workers were in unions. By 1983, only 20 percent of workers were union members. In 2012, union membership reached a historic low of just 11 percent of American workers. Among full-time workers, median weekly earnings for union members are $943, compared with $742 for nonunion workers. The decline of unions has severely curtailed workers' ability to bargain collectively to maintain high wages and good benefits, indirectly fueling a steady decline in the value of the minimum wage. It has also eroded a broader moral commitment to fair pay, from which even nonunion workers previously benefited.
Together, the rise of the service economy, globalization, the decline of unions, and the erosion of the old work contract between employers and employees have created a precarious work environment for more and more Americans. Between the 1980s and 2004, more than 30 million full-time workers lost their jobs involuntarily. And during the Great Recession of 2008-2009, another 8.9 million jobs were lost. In the past few years, long-term unemployment has reached levels not seen since the government began monitoring rates of joblessness after World War II.
Risk Shifts to the Individual
Over the last several decades, both government policy and private sector labor relations have evolved to reduce the sharing of the economic risks involved in managing lives, caring for families, and safeguarding futures. Instead, individual Americans are increasingly being asked to plan for and guarantee their own educations, health care, and retirements. If today's families want a safety net to catch them when they fall, they need to weave their own.
Underlying this shift in risk is neoliberal political ideology, often identified with leaders like Ronald Reagan and Margaret Thatcher, which holds that people will work harder and make better decisions if they must defend themselves against the vicissitudes of life. Neoliberal doctrine views dependence in a negative light (arguing that "coddling" by government undermines individual initiative) and actually celebrates risk and uncertainty as sources of self-reliance. In this new paradigm, individuals are encouraged to gain greater control over their lives by making personal risk-management choices within the free market (and living with the consequences of any misjudgments). In this "ownership society," individuals must learn to be secure with insecurity; the goal is to amass security on one's own rather than to look to government help or collective action for support.
With the rise of neoliberalism, the ethic of sharing risk among workers, employers, and the federal government that emerged after the New Deal was replaced by an aggressively free-market approach that pushed deregulation and privatization in order to minimize the role of government in economic life. At the same time, responsibility for social welfare has steadily devolved from the federal government to states, localities, and even the private sector. The push toward privatizing social services reached a new level when President George W. Bush, through his establishment of the White House Office of Faith-Based and Community Initiatives, sought to formally create public-private partnerships in which welfare provision would increasingly be supplied not by the government but by religious organizations. The result of this devolution of social services has been the replacement of a relatively stable, consistent system of safety-net programs with a patchwork of state, local, and private programs, all of which scramble to find funding.
Though many Americans may be unfamiliar with the risk shift story, the results are widely known. From 1980 to 2004, the share of workers covered by a traditional defined-benefit retirement pension decreased from 60 percent to 11 percent. In contrast, the share of workers covered by a defined-contribution retirement benefit like a 401(k) plan, in which the worker is fully responsible for saving and managing his or her savings, grew from 17 percent in 1980 to 61 percent in 2004.
Traditional employer-provided health care coverage began to erode as well. From 1979 to 2004, coverage dropped from 69 percent to 55.9 percent. In 2010, 49 million Americans were uninsured, an increase of close to 13 million people since 2000. For workers who continue to receive coverage, their share of the costs has increased drastically. A survey conducted by the Employee Benefit Research Institute found that, to cover medical costs, 45 percent of workers have decreased their contributions to other savings, 35 percent have had difficulty paying other bills, and 24 percent have had difficulty paying for basic necessities.
The Affordable Care Act, passed in 2010 and upheld by the Supreme Court in 2012, will greatly expand affordable health care. As a result of the legislation, it is estimated that by 2019, 29 million Americans will gain health insurance coverage. However, an equal number will still be uninsured. And the number of uninsured may rise depending on how many states opt out of expanding Medicaid eligibility. Currently twenty states will not participate in the Medicaid expansion. Analyses of these states have found that, as a result, about 5.3 million people will earn too much to qualify under their state's Medicaid eligibility level but too little to be eligible for the tax credits that help offset the cost of insurance. Of the ten least-insured metropolitan areas in the United States, seven are in states that will not expand Medicaid eligibility.
When it comes to aid for higher education, federal funding has grown, but that aid has mostly come in the form of loans rather than grants. Over the last decade, grants have made up between 22 and 28 percent of federal aid for education, while loans have made up between 61 and 70 percent. Moreover, even though there has been a 15 percent increase in the number of low-income students who receive a Pell Grant, the maximum award these students can receive now covers only about a third of the costs of a college education, as compared to around three-quarters in the 1970s.
The high price of a college degree is linked with a significant decline in the number of low- and moderate-income students who enroll in and graduate from college. Between 1992 and 2004, the percentage of low-income students enrolled in a four-year college decreased from 54 to 40 percent, and the percentage of middle-income students decreased from 59 to 53 percent. For low-income children, the college completion rate increased by only 4 percentage points between the generation born in the early 1960s and the generation born in the early 1980s. In contrast, among high-income children the college graduation rate increased 18 percentage points between generations. If education is the ladder by which less-advantaged Americans can hope to rise to the middle class and beyond, the rungs of that ladder are increasingly out of reach, yet another way in which the traditional system of shared social responsibility has been gradually dismantled over the past forty years.
With instability and uncertainty figuring prominently in people's lives, it is important to ask whether these social and economic trends are reflected in the way Americans feel. Do Americans feel more insecure? Have they become more worried? These questions turn out to be difficult to answer.
The first obstacle is that we lack rich, long-term survey data that would enable us to tease out an in-depth answer. As a recent Rockefeller Foundation report noted, efforts to assess and measure people's sense of security are rare. And the surveys we do have focus almost exclusively on job loss, which is just one risk among the many that need to be explored.
A second obstacle to measuring perceptions of security and insecurity across the decades is that people may not judge and evaluate their situations by the same criteria over time. In other words, can we assume that year in and year out people use the same yardstick to measure whether they are having a good or bad year? If assessments and meanings change over time and surveys don't capture these subjective changes, then it's not clear what our assessments are really measuring.
Richard Curtin, the director of the Survey of Consumers at the University of Michigan, addresses the subjective nature of evaluation in his analysis of how the standards by which consumers judge the economy have changed over the last fifty years. For example, during the 1960s people had high expectations and were very confident about the government's ability to control the economy and keep things on track. Such optimism about rising affluence ran into a brick wall during the economic shocks of the 1970s and early 1980s. Initially, dissatisfaction ensued as people continued to hold onto the economic aspirations of the past. By the mid-1980s, however, after repeated economic setbacks, consumers lowered their expectations about achievable growth rates and became more tolerant of high inflation and high unemployment. By the early 1990s, fears about job security grew as Americans became skeptical about the government's ability to use economic policy to prevent downturns.
At this point expectations were so diminished that it took one of the longest economic expansions in U.S. history to reset high levels of optimism. Fueled by the dot-com boom, aspirations soared. In 2000, consumer confidence hit a new peak. With expectations high, consumers in the early 2000s cited high unemployment as an issue even though it was only around 6 percent, about half of what it had been in the early 1980s. The optimism of the late 1990s soon gave way to pessimism in the face of the recessions of 2001 and late 2007. In fact, between January 2007 and mid-2008, the Index of Consumer Sentiment fell by 42 percent, a larger percentage decline than in any other recession.
By mapping out historical shifts in consumers' assessments of the economy, Curtin illustrates how "the same level of economic performance, say in terms of the inflation or unemployment rate, can be evaluated quite differently depending on what was thought to be the expected standard." Moreover, changes in standards of evaluation usually occur very slowly and therefore can be difficult to detect. And since different groups of Americans have fared differently as a result of macroeconomic changes, it stands to reason that some Americans may have altered their standards and expectations sooner than others, and some may have altered their aspirations more significantly, and perhaps more permanently. In all likelihood, for example, autoworkers had to let go of their expectations for a secure economic life long before and to a much larger degree than have college-educated Americans.
With this in mind, when sociologists Katherine Newman and Elisabeth Jacobs looked at survey data from the late 1970s to just before the Great Recession that examined people's economic perceptions, they found something interesting. Their analysis revealed that, despite a few peaks and valleys, overall trends during this period suggest that Americans came to see themselves as more secure and in better financial shape, with about the same likelihood of losing their job. As we might expect, their analysis found that those with the lowest incomes and least education expressed the most vulnerability to employment insecurity and financial hardship, while those with higher incomes and more education expressed lower levels of concern.
Tellingly, however, the groups that showed the biggest increase in worry during that period were college-educated Americans and managers: the groups whose human capital (skills, credentials) has likely enabled them to retain their definitions of security in the face of economic changes that favored the educated and technically skilled. Over the last thirty years, the proportion of college graduates who said they were likely to lose their jobs in the coming year and the proportion who said they did worse financially than in the previous year both went up. The same pattern holds for managers. And these concerns were valid. During this period the rate of job loss for the most educated rose faster than the rate for less educated Americans. Moreover, when workers lost their jobs and found new ones, the new jobs often paid less. By 2001, workers with a bachelor's degree experienced about a 23 percent drop in their earnings after losing a job.
If discontent emerges when there is a gap between expectations and outcomes, then it makes sense that concern would increase most among the group best positioned to maintain high expectations about security and prosperity over the last several decades. It is very possible that worry, as measured by feelings about job insecurity and financial hardship, did not increase as much among other groups over a sustained period because they altered their expectations sooner and more permanently than did better-off Americans. As Newman and Jacobs point out, when those at the bottom lose a job, there is not as far to fall. For such families, their economic situation doesn't change much from year to year; it's always bad. Alternatively, other families may have taken on debt in order to hold onto their standards for security. The lack of a consistent and steep increase in worry among less well-off Americans thus does not necessarily signal that they feel more secure than they used to. Instead, it could mean that they have gotten used to having less, or gotten used to the high levels of debt required to hold onto traditional conceptions of security amid declining fortunes. What is also likely going on is that people's frame of reference for what security even means has undergone a transformation. Finally, it could be that our standard measures for these issues (concern about job security and whether or not you are worse off this year than last) don't allow us to accurately assess people's feelings.
We do not have the kind of comprehensive longitudinal survey data that would enable us to detect subjective changes in Americans' views about what constitutes security and insecurity and whether such definitions shape trends in worry and concern over time. But other measures point to increases in insecure feelings among Americans. For example, even before the Great Recession started, about half of those surveyed worried somewhat about their economic security, with one-quarter "very" or "fairly" worried. By 2009, just over half of those surveyed were "very" or "fairly" worried. A Pew Research survey done in 2011 found that only 56 percent of those polled felt that they were better off financially than their own parents were at the same age, the lowest percentage since the question was first asked in 1981, when 69 percent said they felt better off. In 2012, the General Social Survey (GSS) found that less than 55 percent of Americans agreed that "people like me and my family have a good chance of improving our standard of living," the lowest level since 1987. That same year, the GSS also found that a record 8.4 percent of Americans identified themselves as "lower class," the highest percentage in the forty years that the GSS has asked this question.
And we may be seeing changes in the definition of the American dream. The American dream has long been equated with moving up the class ladder and owning a home, but recent surveys have noted shifts away from such notions. When Joel Benenson, chief pollster for President Barack Obama, examined voters' thoughts about economic security and the American dream in 2011, he found something new. His polling discovered that middle-class Americans were more concerned about keeping what they have than they were with getting more. A 2011 survey found the same thing. When asked which is more important to them, 85 percent of those surveyed said "financial stability" and only 13 percent said "moving up the income ladder." In 2007, a survey found that owning a home defined the American dream for 35 percent of those surveyed. By 2013, the top two definitions of the American dream were "retiring with financial security" (28 percent) and "being debt free" (23 percent). Only 18 percent of those surveyed defined the American dream as owning a home.
As the economy experienced wide-reaching transformations, meanings and feelings have likely changed along with it. A National Journal article noted how even the definition of being middle-class has undergone adjustment, especially in light of the rise of contract workers or "permatemps," those who may make a good wage but receive no benefits and can expect no job security. Capturing this adjustment, the article asks, "If they make a decent income, are permatemps middle class? Not by the standards of the past. But by the diminished redefinition, maybe they are: earning a middle-class living-for the moment."
Amid these shifting economic tides and morphing definitions, many have lost their way. While old beliefs, such as the conviction that hard work leads to security and prosperity, have fallen by the wayside, it is unclear to many Americans what new truths lie in their stead. As President Obama's pollster Joel Benenson discovered, this lack of direction causes a great deal of unease. "One of the big sources of concern for the people we talked with," Benenson said, "was that they didn't recognize any new rules in this environment. All of the rules they had learned about how you succeed, how you get ahead-those rules no longer apply, and they didn't feel there was a set of new rules." These kinds of examinations suggest that in the age of insecurity, Americans are not just trying to weather an economic storm but also feeling their way through the dark.
The Age of Insecurity
In the throes of the Great Depression, Americans decided that there had to be a better way to organize government and society, one that would allow individuals and families to enjoy greater stability and security. This philosophical shift from "rugged individualism" to "united we stand, divided we fall" paved the way for the New Deal, the Great Society, and the forging of an unwritten but pervasive social contract between employers and employees that rested on mutual loyalties and protections. The government invested in its citizens, employers invested in their employees, and individuals worked hard to make the most of those investments. As a result, in the decades immediately following World War II, prosperity reigned, inequality decreased, and a large and thriving middle class was born.
Beginning in the 1970s, this system began to unravel. Large-scale changes, from globalization and the rise of the service economy to a philosophical shift toward free-market ideology and a celebration of risk, transformed the landscape of security in America. Against this backdrop, the government curtailed its investments in and protections of its citizens, and employers rewrote the social contract to increase their own flexibility and demand greater risk bearing by workers. Individuals continued to work hard, but instead of getting ahead, more Americans struggled harder and harder just to get by.
Insecurity now defines our world. The secure society has become the "risk society." The belief that we are all in this together has been replaced with the assumption that you're on your own. Cut adrift, Americans are struggling to forge security in an insecure age.
How are they coping with this new and often frightening reality? And how do the emerging strategies and psychological adaptations for managing insecurity vary from one social and economic group to another? These are the questions I will turn to next.