The Hole in the Bucket

Americans obsessed over personal finance during the last forty years as never before. So how come so many of us wound up broke? Here's the little-known story.
July 11, 2012

Never before in history has the great American middle class obsessed so much over financial planning as during the last forty years or so. In the 1970s, this obsession fueled the growth of hot new magazines like Money and TV shows like Louis Rukeyser’s Wall $treet Week. By the 1980s, it had led to the creation of personal finance sections in almost every newspaper, and to myriad radio talk shows counseling Americans on what mutual funds to buy, how much they should put into new savings vehicles like Individual Retirement Accounts or Keoghs, and how to manage their new 401(k) plans.

In the 1990s, millions of Americans learned the accounting program Quicken, avidly followed the tips offered by Jim Cramer and the Motley Fool, and employed legions of tax and financial advisers and online tools to help them figure out whether they should convert to a Roth IRA and how they should take advantage of the new “529” college savings accounts. In the last decade, millions more have turned to outlets like HGTV to learn the ins and outs of flipping houses, consolidating credit card debt with a home equity loan, and combining a medical savings account with a high-deductible insurance plan.

Who in the 1950s ever worried so much about managing money?

And yet here we are today. According to a recent study by the Employee Benefit Research Institute, fully 44 percent of Baby Boomers and Gen-Xers lack the savings and pension coverage needed to meet basic retirement-age expenses, even assuming no future cuts in Social Security or Medicare benefits, no reductions in employer-provided benefits, and no further declines in home prices. Most Americans approaching retirement age don’t have a 401(k) or other retirement account. Among the minority who do, the median balance in 2009 was just $69,127. Meanwhile, the college students who graduated in 2011 started off their adult lives encumbered by an average $25,000 in student loans.

What went wrong? We can all come up with scapegoats, of course. It’s common to hear, for example, that America became a nation of impulse shoppers and spendthrifts over the last generation. But like a lot of conventional wisdom, this consensus isn’t just wrong; it’s mean-spirited. The average American household actually spends significantly less on clothes, food, appliances, and household furnishings than did its counterpart of a generation ago. There is, however, a deeper story to tell—one that is still largely unacknowledged in our political debates.

As far back as the 1970s, many thoughtful observers could see the basic outlines of the middle-class insolvency crisis we are facing today. Some were even able to predict accurately, decades in advance, that it would all start coming to a head around the end of the 2000s. The big new trend they focused on was changing demography. By the end of the ’70s, birth rates were falling precipitously, meaning that the rapid population growth that had come with the arrival of the Baby Boom generation in the 1950s was over. At the same time, Americans were living longer and retiring earlier, and the poverty rate among children was starting to rise sharply.

This sea change had all sorts of implications. But one of the most obvious—to, among others, thoughtful liberals like Joseph Califano, a member of President Carter’s cabinet, and Senator Daniel Patrick Moynihan—was the long-term challenge the shift presented to the financing of Social Security and private pension plans. At the time, Social Security was paying out benefits to retirees that exceeded the value of their contributions by between $250,000 and $300,000 in today’s money, as Sylvester Schieber, former chairman of the Social Security Advisory Board, recounts in his recent book, The Predictable Surprise. Yet while these windfalls had set expectations among Americans of all ages about what constituted a minimally comfortable middle-class standard of living in retirement, they were clearly unsustainable.

By the end of the 1970s, the wages of most young people paying into the system were stagnating. By 1983—even with the arrival of the full Baby Boom generation into the workforce, including gigantic increases in the number of working women—Social Security was running out of money.

In perhaps the last major act of bipartisanship in Washington, President Ronald Reagan and House Speaker Tip O’Neill came together to “save” the system. But by steeply raising payroll taxes on workers and cutting their future benefits, they made Social Security a much less generous deal for generations going forward. Most would get little more back from the system than they paid in. Baby Boomers and their children would just have to learn how to live without the windfall benefits of the past.

At the same time, another major prop that had sustained the standard of living of the past generation was also clearly in trouble: private pensions. Like Social Security, these plans were threatened by the declining ratio of workers to retirees. Adding to the challenge, many of the highly unionized, old-line manufacturing companies that typically offered these kinds of pension plans were starting to be subject to creative destruction, union busting, or, because of the shift to automation, substantial reductions in the numbers of workers paying into their plans.

From the point of view of younger workers in the emerging workforce, these traditional benefit plans, even when well funded, were also an increasingly bad deal. Most did not offer any benefits until a worker had been with a company for at least five years, sometimes ten, and benefit formulas tended to be heavily back-loaded in favor of those who spent a whole career with one company—an increasing rarity. The old pension plan system was particularly hard on women trying to combine part-time work with family responsibilities. For more and more workers, traditional pensions going forward would offer little or nothing.

Thus, reasonable people called for the creation of new 401(k) and other so-called “defined contribution” plans. Under the new plans, workers did not lose benefits just because they changed jobs. To be sure, they lost any guarantee of fixed monthly benefits for life, but if their investments did even modestly well, they could earn far more than they would have under traditional plans.

Meanwhile, there was no way that the new plans would encumber individual companies or the next generation with pension debts; by design, beneficiaries received only whatever the market value of their investments turned out to be. In broad outline, transitioning to these plans, in the face of the demographic and workplace changes America was experiencing, was a reasonable and rare example of the government and the private sector collectively engaging in prudent, long-term planning.

Except, of course, for a few “minor” details that hadn’t been thought through. This massive change of strategy—arguably on a scale of the New Deal in its ultimate effects on American life, only on the downside—might have worked, for example, had all Americans been required to save for retirement. But mandating savings was off the table, because some said that would be “paternalistic”—and in the age of Reagan and spreading free market libertarianism, that was an argument stopper.

Nearly half of all workers lacked access to a 401(k), and only about a third of those whose employers offered the plans participated. This often meant, of course, that workers forfeited the free money offered through employer matching contributions. In addition, the workers who joined were often clueless about how to manage their portfolios, either parking their nest eggs in money-market accounts that didn’t keep up with inflation or speculating in tech stocks.

In line with human nature, many also simply failed to save enough. In the years when the stock market was routinely returning 15 to 20 percent, it was easy to believe the hype and conclude that only a small amount of savings would be required to meet one’s retirement goal. Still more people cashed out their 401(k)s every time they changed jobs or had a short-term financial emergency. Others saw their retirement savings cut in half or worse by divorce settlements.
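The arithmetic shows how seductive that hype was. The sketch below is purely illustrative (the $200 monthly contribution, the thirty-year horizon, and the 15 percent and 5 percent return assumptions are mine, not figures from this article), but it captures how drastically the assumed rate of return changes the nest egg a given savings habit appears to produce.

```python
# Purely illustrative: future value of steady monthly saving at two assumed
# annual returns. The contribution, horizon, and rates are hypothetical.

def future_value(monthly_contribution, annual_return, years):
    """Value of a stream of monthly contributions compounded monthly."""
    monthly_rate = annual_return / 12
    balance = 0.0
    for _ in range(years * 12):
        balance = (balance + monthly_contribution) * (1 + monthly_rate)
    return balance

# Saving $200 a month for 30 years:
print(f"${future_value(200, 0.15, 30):,.0f}")  # roughly $1.4 million at 15 percent
print(f"${future_value(200, 0.05, 30):,.0f}")  # roughly $167,000 at 5 percent
```

A worker who planned around the first number and lived through something closer to the second would come up hundreds of thousands of dollars short.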

Meanwhile, the percentage of the population that took advantage of Individual Retirement Accounts was even lower. Partly this was due to inertia or lack of sophistication. But it was also due to the fact that people with little or no income could gain little or no tax relief by investing in an IRA or other defined contribution plan. As the system came to operate, the federal government spent billions of dollars offering tax subsidies to rich people who sheltered their income in retirement accounts while offering virtually nothing to those who most needed to save. In effect, the whole system became pure “Robin Hood in Reverse”—which, of course, didn’t much bother Republicans, and many Democrats went along as well.
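The regressive tilt follows directly from how a deduction works: the subsidy is worth the saver’s marginal tax rate times the contribution, so the less you earn, the less the tax code pays you to save. A minimal sketch, using a hypothetical $5,000 contribution and hypothetical bracket rates (neither figure comes from this article):

```python
# Hypothetical illustration: the value of an IRA deduction scales with the
# saver's marginal tax rate. The rates and contribution below are assumptions.

contribution = 5_000

for saver, marginal_rate in [("low-income worker", 0.10), ("high-income saver", 0.35)]:
    subsidy = contribution * marginal_rate
    print(f"{saver}: a ${contribution:,} deduction cuts taxes by ${subsidy:,.0f}")
```

Someone with no income tax liability at all, of course, gets nothing.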

Another big problem, easy to see in hindsight, was that while we were telling ordinary Americans to trust their savings to the sharks on Wall Street, we also were telling the sharks that there were no more rules. By the end of the 2000s, millions of humble Americans had invested their savings in mortgage-backed securities, believing their broker’s assurances, or that of some raving head on cable, that these were safe. Many who bought even garden-variety mutual funds saw their returns eroded by hidden fees.

In the meantime, financial markets devolved to the point where there were no longer many safe ways to invest that would keep up with inflation, so people took on more and more risk as they chased higher yields, often having no idea what they were doing. Many turned to investing in their houses or buying bigger ones. Many also, in effect, wound up using credit card debt to finance their stock market and real estate speculations—as when, for example, people put money into mutual funds or an upgraded kitchen and then found they needed to use their credit cards to cover routine household expenses, like gas, groceries, or the rising cost of health insurance.

This brings us to the other side of the deteriorating balance sheet of most Americans, and one that was just as consequential in bringing us to where we are today: debt. It’s bad enough that American society blew the plan that was supposed to get Baby Boomers and younger Americans saving much more for their retirement; at the same time, we exposed the same population to an epidemic of predatory lending.

It’s almost impossible to exaggerate the drama of this story. To put it in some historical perspective, for as long as there has been credit flowing in human history—going back at least as far as the code of Hammurabi, circa 1750 BC—there have been laws to prevent usury. The Old Testament tells of the Prophet Ezekiel, who included usury in a list of “abominable things,” along with rape, murder, robbery, and idolatry. Roman law capped interest rates at 12 percent. According to the Qur’an, “Those who charge usury are in the same position as those controlled by the devil’s influence.” Dante condemned usurers to the seventh circle of hell, along with blasphemers and sodomites. Martin Luther argued that any interest rate above 8 percent was generally immoral, and the Puritans who settled the Massachusetts Bay Colony agreed, adopting America’s first usury law 150 years before the ratification of the Constitution.

Most of America’s Founding Fathers thought them right to do so. Notes law professor Christopher L. Peterson, “Throughout the history of the American Republic, all but a small minority of states have capped interest rates on loans to consumers with usury law.” In the Progressive Era, reformers pushed a Uniform Small Loan Law that capped interest rates at 36 percent and limited such lending to specially licensed lenders adhering to strict standards. As late as 1979, all states had laws of some sort that capped interest rates.

This short history of usury laws puts into perspective just how bizarre the credit markets of the United States have become over the last forty years. Usury law is, in the words of one financial historian, “the oldest continuous form of commercial regulation,” dating back to the earliest recorded civilizations. Yet starting in the late 1970s, some powerful people decided we could live without it.

First to go were state usury laws governing credit cards. Before 1978, thirty-seven states had usury laws that capped fees and interest rates on credit cards, usually at less than 18 percent. But in 1978 the Supreme Court, in its fateful Marquette decision, ruled that usury caps applied only in the state where a bank had its corporate headquarters, not in the states where its customers actually lived. Banks quickly moved their credit card operations to states like South Dakota and Delaware, which eliminated their usury caps to attract them, leaving the banks completely free to charge whatever interest rates and fees they wanted. Meanwhile, states eager to hold on to the banks headquartered within their borders promptly eliminated their usury laws as well.

Later, in 1996, the Supreme Court handed usurers another stunning victory. In Smiley v. Citibank it ruled that credit card fees, too, would be regulated by the banks’ home states. You might think that market forces would set some limits on how high credit card fees and interest can go—after all, there are only so many creditworthy borrowers, and much competition for their business. But with shrewd use of “securitized” debt instruments and hidden fees, banks and other lenders found they could make more money from those who could not afford credit cards than from those who could.

And this was only the beginning. By the early 2000s, thanks to the combination of deregulation and “financial engineering” on Wall Street, middle- and lower-class neighborhoods across America were being flooded with what could be called financial crack. In the years between 2000 and 2003 alone, the number of payday lenders more than doubled, to over 20,000. Nationwide, payday lending storefronts came to rival the number of Starbucks and McDonald’s locations combined.
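The pricing of these loans shows why “financial crack” is not much of an exaggeration. The fee and term below are typical industry figures used for illustration, not terms cited in this article, but the annualization is straightforward:

```python
# A payday loan quoted as a "small fee" translates into a triple-digit APR.
# The $15-per-$100 fee and 14-day term are illustrative industry norms,
# not terms cited in the article.

fee_per_100 = 15        # dollars charged per $100 borrowed
term_days = 14          # due on the borrower's next payday

periodic_rate = fee_per_100 / 100            # 15% for two weeks
apr = periodic_rate * (365 / term_days)      # simple annualized rate

print(f"Effective APR: {apr:.0%}")           # roughly 391%
```

And that is before any rollovers; a borrower who cannot repay on payday and rolls the loan over pays the fee again, so the effective cost only climbs from there.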

And along with the payday lenders came new, more vicious species of loan sharks: subprime credit card issuers, auto title lenders, private student loan companies charging up to 20 percent APR, check-cashing outlets, and subprime mortgage brokers and lenders. Just the hidden fees—what Devin Fergus of Hunter College-CUNY calls the “trick and trap fees”—on student loans, mortgages, and credit cards sucked billions of dollars a year off the balance sheets of American families.

Meanwhile, of course, expanding mortgage credit, combined with continued generous tax subsidies for those who borrowed to buy a home, drove up home prices beyond all reason, while causing millions of Americans to overinvest in real estate as the bubble grew. And then, catastrophe.

It was a perfect storm. One that today leaves 69 percent of Americans saying it’s harder for them to achieve the American Dream than it was for their parents, and a full 73 percent saying it will be harder yet for their children and grandchildren. One that, according to a slew of new studies, now makes it harder to climb the socioeconomic ladder in the U.S. than in many parts of supposedly class-bound Europe. One in which about a third of all children born into the middle class in the 1960s and ’70s have fallen out of it. One that has seen the net wealth of Latino households fall by 52 percent between 2007 and 2009, and that of African American households by 30 percent. One in which the typical American family is now so deeply in debt and bereft of assets that it could survive only a month or two without a paycheck or some form of government assistance.

What are we going to do about it?

It’s tempting to ask why we can’t just go back to the “golden era” before the 1970s. And when it comes to the regulation of financial institutions, we should, indeed, do that.

But if you have any idea how to restore us to another era of long-term, salaried employment with traditional pensions and health care benefits, please write a letter to the editor of this magazine, now. And don’t forget to explain how these pensions and employer-based benefits would serve the interests of those of us who must jump from job to job, who are trying to start a new business or work part-time as we raise families or care for an aging relative.

Certainly there are things we could do that would help to get wages moving up again and make jobs more secure at least for a while. We could close the door to immigrants, if you want to go there. We could impose high tariffs on imports. We could make it easier for workers to unionize. And, to be sure, we could find ways to “bend the cost curve” on health care, to make higher education more cost-effective, and maybe even, with enough R&D, come up with huge supplies of cheap, green energy. We could also put taxes on high-income Americans back to where they were during the Clinton administration. We could even raise the income tax rate on the top 1 percent to 100 percent—which would raise enough money to pay for Medicare for roughly three years.

But in the end, none of that helps much if Americans still can’t avoid predatory debt and save securely for life’s predictable expenses and necessary investments. Americans need to be able to finance periods of unemployment or retraining. And above all they need to finance that prolonged period most of us will experience when we become too worn out and frail to find or hold down a job in the economy of the twenty-first century. We have come through a long era in which “prosperity” was financed, in effect, by depleting the net wealth of the average American. Getting back to real prosperity requires not just more jobs, but also fundamental reforms that will help more Americans hold on to more of their income and rebuild their wealth.
