Yesterday Paul Krugman wrote that there are people (whom he calls "Very Serious People") who dismiss the austerity/stimulus debate as "simplistic" and Keynesian concerns as "crude". He said, "There were people like that during the Great Depression too — dismissing as naive any notion that you could put the unemployed back to work just by spending more."
World War II required massive government spending by the U.S. (aka "stimulus"), which not only won the war but also lifted America out of the abyss of the Great Depression. It also created the greatest middle class the world had ever known at the time. (Today China has the largest middle class, estimated at about 500 million people, more than the entire population of the U.S.)
Since the end of the war, all the way up until the late 1970s (almost 40 years), labor unions were strong, wages more or less kept pace with productivity (and with profits and CEO pay), and America's middle class prospered (even the inflation years of Jimmy Carter and the oil embargo were survivable).
But as David Cay Johnston recently reminded us in his article Economy Grows, Incomes Shrink (a point that can't be repeated often enough in this Second Gilded Age): "The decline in compensation for workers at every level below the handful of jobs paying $50 million or more tracks the decline in union membership among private-sector workers. Lacking bargaining power, most workers have had to accept flat to falling pay, a trend that has spurred local drives nationwide to raise the minimum wage to $15 per hour ... So long as government policy favors the richest among us, shields bankers from criminal and personal civil liability and removes regulatory controls on corporations, the trend line that began 34 years ago is likely to continue."
Just recently I wrote: "If the top 1% ... didn't want to bear any more of the tax burden, they could have raised wages over the past 35 years on par with worker productivity." I mention this because, after everything I've read, I use 1979 as the milestone when labor union membership peaked and when the middle class began to decline (the same time frame that Mister Johnston and many others have referenced).
Currently the Employment-Population Ratio is 59.2; it was 60.1 in 1979 (not much different). But 36 years ago far fewer women were in the workforce, and two-income households weren't as prevalent as they are today, because dad could still bring home enough bacon to pay the mortgage while mom stayed home.
Currently the Labor Force Participation Rate is 62.7; it was 63.9 in 1979 (not much different). But back then a single-income household was usually enough to pay the bills. Since that time, the rising cost of living has wiped out most wage gains: adjusted for inflation, wages have stagnated while productivity, corporate profits, and CEO pay have continued to rise (up to this very day).
And since 1979, divorce rates and the number of unwed mothers have increased substantially, forcing many more women into the labor force, so it wasn't JUST the "women's movement" that accounts for this increase.
Back in the day (during the 1950s, 1960s, and 1970s) a true middle-class income could feed a family of four and pay a home mortgage, and one could still afford a three-year auto loan on a new car. (These days auto loans can run five, six, or seven years.)
But since the good ole days, the middle class has shrunk: some moved up into higher income brackets, but the vast majority moved down into lower ones. A real middle-class income today might be closer to $75,000 a year, yet most people don't earn anywhere near that much.
If we arbitrarily use $75,000 as a true middle-class wage (whether someone is single, married, or has children, and no matter what region of the country one lives in), then only about 20% of all wage earners are in the actual middle class, about 7% are in the upper middle class (or wealthy), and everybody else (a whopping 73%) is in the lower middle class or poor.
Note that a "median household income" is the total combined income of all people living in one household — with "median" meaning, half of all households in the U.S. earned less and the other half earned more. Using "median income" is a more representative measurement for describing incomes rather than using "averages".
Previously, Mister Johnston made us aware of data from the Social Security Administration (SSA), released every October for the previous year, showing that 50% of all wage earners took home $28,000 or LESS. (The SSA data covers only income reported to the IRS as "wages", so it excludes those whose only income is from capital gains.) If we double that number for a two-income household, we get close to the "median household income", which is still too low for a family of four to live a decent middle-class lifestyle in most parts of the country.
Using this same data, we can also see that 86% of all wage earners make $75,000 or less (and 92% earn $100,000 or less); yet even Democrats in Congress want tax cuts for households making $200,000 a year. (Someone earning that much in wages is very near the top one percent of wage earners!)
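The SSA publishes its net-compensation data in fixed wage brackets, so percentages like "86% earn $75,000 or less" come from tallying bracket counts. Here's a sketch of that tally in Python; the bracket counts below are hypothetical stand-ins (chosen only to roughly echo the figures above), not the actual SSA table.

```python
# Hypothetical bracket counts standing in for the SSA net-compensation
# table (the real figures are published at ssa.gov each October).
# Each tuple: (upper bound of the wage bracket, number of earners).
brackets = [
    (25_000,  70_000_000),   # made-up counts, for illustration only
    (50_000,  45_000_000),
    (75_000,  18_000_000),
    (100_000,  9_000_000),
    (None,    13_000_000),   # None marks the open-ended top bracket
]

total = sum(count for _, count in brackets)
running = 0
for upper, count in brackets:
    running += count
    label = f"<= ${upper:,}" if upper else "all earners"
    print(f"{label:>11}: {100 * running / total:5.1f}% of wage earners")
```

With these stand-in counts (totaling about 155 million earners), the running tally lands near the 86% and 92% cited above; the same loop over the real SSA table would reproduce the actual figures.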
The SSA reports: "66.9 percent of wage earners had net compensation less than or equal to the $43,041.39 raw average wage. By definition, 50 percent of wage earners had net compensation less than or equal to the median wage, which is estimated to be $28,031.02 for 2013." NOTE: Very high incomes at the very top skew the "average" upward, whereas the "median" tells a very different story.
Just to be clear, incomes can be derived from many other sources besides hourly wages, salaries, and capital gains: rents, royalties, dividends, Social Security retirement benefits, disability benefits, unemployment insurance benefits, pensions, lottery winnings, etc. (all of which also affect an individual's annual adjusted gross income).
And as a side note: the IRS says we had about 145 million tax returns, but Social Security reports about 155 million wage earners. We also have dual- and multiple-income households (and joint tax returns), which can make workers look like they earn more in proportion to everyone else.
Related posts by David Cay Johnston:
Compensation shrinks for all income groups – except the very highest (October 23, 2014)
Highest earners making less, Social Security data show (August 18, 2014)
Minimum wage laws aren’t just for workers with lowest wages (August 11, 2014)
Thank you for the kind words.
As for your gripe, the first IRS report on 2013 incomes does not present data in a way that the median (half make more, half less) can be computed, only the mean (averages). I also cautioned that data for high income Americans might change when the final data is released next Fall.
You cite the median wage data from Social Security that became available last October. Who broke the story of those figures? I did. Each year I am alone, or almost alone, in reporting on this data – and am absolutely alone in doing analysis of the kind I did in October, which you can read here: http://alj.am/1rlCSXS
Wages are about 70 percent of individual income. I reported that the 2013 median wage "was up a scant $109, to $28,031. That was still $320 below the 2000 median. It also was slightly lower than the 1999 median of $28,109, a troubling measure of long-term wage stagnation that is eroding the American work ethic and discouraging individual investments in acquiring and refining job skills."
I also did an exhaustive three-part analysis of that data comparing 2000 to 2012 last August: http://alj.am/1ANMJfx and http://alj.am/1o4R3y6 and http://alj.am/1sNI3Du.
Such long-term comparisons normally cannot be done because the government reports data in fixed income brackets ($15,000 to $20,000, for example) with no inflation adjustments. But I noticed that 2012 prices were exactly a third higher than 2000 prices ($15,000 becomes $20,000, etc.) and was able to make the exhaustive calculations used in the three columns linked above.
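To make the arithmetic behind that observation concrete: if cumulative inflation from 2000 to 2012 was exactly one third, then every 2000 bracket edge times 4/3 lands on another round bracket edge. Here's a quick Python sketch with a simplified, illustrative set of edges (not the full government table):

```python
# If prices rose by exactly one third from 2000 to 2012, each 2000
# bracket edge maps onto another round edge (multiply by 4/3), so the
# fixed brackets line up across years without re-binning the data.
# These edges are simplified for illustration, not the full table.
edges_2000 = [15_000, 30_000, 45_000, 60_000, 75_000]

for edge in edges_2000:
    adjusted = edge * 4 / 3          # a one-third price increase
    print(f"${edge:,} in 2000 dollars = ${adjusted:,.0f} in 2012 dollars")
```

Every adjusted edge lands exactly on a higher bracket boundary ($15,000 on $20,000, $45,000 on $60,000, and so on), which is what made the bracket-to-bracket comparison possible.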
Thanks for the comment. If you'll notice, I've edited this post ;)
The New Yorker: A Fair Day’s Wage
The fact that the benefits of economic growth in the postwar era were widely shared had a lot to do with the assumption that companies were responsible, not only to their shareholders, but also to their workers ... Working on the Model T assembly line was an unpleasant job. Workers had been quitting in huge numbers or simply not showing up for work. Once Ford started paying better, job turnover and absenteeism plummeted, and productivity and profits rose ... One of the reasons retailers like Trader Joe’s and Costco have flourished is that, instead of relentlessly cost-cutting, they pay their employees relatively well ... Since the nineteen-seventies, a combination of market forces, declining union strength, and ideological changes has led to what the economist Alan Krueger has described as a steady “erosion of the norms, institutions and practices that maintain fairness in the U.S. job market.” This isn’t because companies are having trouble making money: corporate America, if not the rest of the economy, has done just fine over the past five years. It’s that all the rewards went into profits and executive salaries, rather than wages. That arrangement is not the result of some inevitable market logic, but of a corporate ethos that says companies should pay workers as little as they can, and no more.
http://www.newyorker.com/magazine/2015/02/09/fair-days-wage?mbid=rss