A question has haunted the aftermath of the Great Recession: where is all the inflation? If the Federal Reserve spikes the money supply as massively as it has since 2008, against a backdrop of minimal economic growth, severe price inflation would seem bound to ensue. Yet the consumer price index has grown at a mere 1.5% per year for the past five years.
Back in the “stagflation” era, the 1970s and early 1980s, and cued up by President Nixon’s taking the dollar off gold in 1971, the Federal Reserve printed money like never before. Inflation duly roared. From 1971 to 1981, the CPI leapt a phenomenal 125%, 8%-plus per annum. And in the nine years after the 1973 peak, economic growth averaged less than 2% per year.
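As a quick sanity check on those figures, the compound annual rate implied by a cumulative rise is easy to compute. The sketch below (the helper name `cagr` is ours, not any official statistic) confirms that a 125% CPI rise over the ten years from 1971 to 1981 does work out to "8%-plus per annum":

```python
def cagr(total_growth_pct: float, years: float) -> float:
    """Compound annual growth rate implied by a cumulative percentage rise."""
    return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

# A 125% CPI rise over 1971-1981 (ten years):
print(round(cagr(125, 10), 1))  # → 8.4, i.e. "8%-plus per annum"
```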
Economic stagnation plus Fed activism equals stagflation: we saw it all thirty-five years ago. What gives these days? Why only half of the stagflation bargain?
The recent break in the gold price, in which the king commodity has completed a 20% drop since the high point of late last year, holds a clue.
In the 1970s, “inflation” was a general phenomenon, one of whose manifestations was an increase in that pet statistic of the government’s, the CPI. The main way investors responded to the Fed blowouts of the era, however, was not to bid up consumer prices but to get their holdings out of asset classes that paid returns in dollars.
Amid an ongoing decline in the price of gold, a major brawl recently broke out in the elite media over … the gold standard. What is this free-for-all about? And why does it matter? It matters because the brawl demonstrates that, after a long eclipse, the gold standard is being taken seriously in elite (if not uniformly polite) company.
Paul Krugman, in a blog post entitled “Cranky Old Men,” attacked a Sunday New York Times jeremiad by former OMB Director David Stockman. Stockman’s tirade, in fact, was more reminiscent of Allen Ginsberg’s Howl — “who burned cigarette holes in their arms protesting the narcotic tobacco haze of Capitalism … Moloch! Solitude! Filth! Ugliness! Ashcans and unobtainable dollars!” — than of an op-ed.
And yet, Krugman’s response possessed all the persuasive power of a 14-year-old’s sarcasm: “It’s cranky old man stuff, the kind of thing you get from people who read Investors Business Daily, listen to Rush Limbaugh, and maybe, if they’re unusually teched up, get investment advice from Zero Hedge. Sad.”
Matthew O’Brien of the Atlantic Monthly, playing Robin to Krugman’s Batman, botched the rescue operation. O’Brien got his facts badly wrong and came across as a propagandist, or apologist, rather than a serious analyst, concluding that “The gold standard didn’t save us from dystopia. The gold standard was dystopia.” Wrong. O’Brien was called out by the centrist Bloomberg and the center-right Forbes.com, his reputation bruised, for concocting a counterfactual counter-narrative.
For some reason, it is conventional wisdom today that the years 1870-1914, the era of the Most Perfect Monetary System Ever Created, were a time of chronic recession and disaster.
But how could that be? The United States was the world’s greatest economic success story of the last two centuries. When did that happen? It didn’t happen during the Civil War or the Great Depression. It must have happened — logically — during times of peace and prosperity.
That’s why Professor Brian Domitrovic says the usual story you hear about the gold standard years is completely wrong.
The United States actually did not return to a gold standard until 1879. However, by 1870, most of the rubble of the Civil War, including the floating “greenback” dollar, had been cleared up. Between 1870 and 1912, a period of forty-two years, industrial production in the United States rose by 682%.
Six hundred and eighty-two percent. Not too shabby.
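Annualized, the figure is still striking. A quick sketch of the arithmetic:

```python
# A 682% rise means industrial production ended at 7.82 times its starting level.
factor = 1 + 682 / 100
years = 42  # 1870 to 1912
annual_rate = (factor ** (1 / years) - 1) * 100
print(round(annual_rate, 1))  # → 5.0, about 5% per year compounded over four decades
```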
In 2008, Ben Bernanke’s and Paul Krugman’s star former doctoral student at Princeton, the economist Gauti B. Eggertsson, published an article in the American Economic Review, the field’s top journal, on the manifest excellence of the New Deal of the 1930s. One of the claims: “1933-1937 registered the strongest output growth (39 percent) of any four-year period in US history outside of wartime.”
The cited source for this statement was the Office of Management and Budget, which does not maintain statistics on “output growth,” or GDP growth, from before 1929. It is not possible to use this source for a claim about “any four-year period in US history.”
As for databases that do include the whole run of U.S. history, those at measuringworth.com show an increase in national output of 43% from 1878 to 1882, a peacetime period.
Either 1933-37 was not the greatest peacetime four-year run of growth in American economic history, or, for lack of canonical sources, the issue is unsettled. Either way, the position presented in the AER was untenable: not only is the 1933-37 claim probably wrong as a matter of fact, but it came attached to a faulty apparatus of verification.
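For comparison’s sake, the two four-year figures can be put on a common annualized footing. A sketch, using the 39% for 1933-37 cited in the AER and the 43% for 1878-82 from measuringworth.com:

```python
def annualized(total_growth_pct: float, years: int) -> float:
    """Annualize a cumulative percentage growth figure over a span of years."""
    return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

print(round(annualized(39, 4), 1))  # → 8.6% per year for 1933-37
print(round(annualized(43, 4), 1))  # → 9.4% per year for 1878-82
```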
Chalk it up to yet another elision in the academic journals. But once these things are out there, they take on a life of their own, re-cited and re-enunciated by readers and then their own readers and listeners in turn. I have certainly heard historians who teach considerable numbers of students approvingly quote Eggertsson’s claim about the supremacy of 1933-37 output growth.
In scholarship, what often happens is that stuff gets put out there, it is erroneous, it spreads, a mess results, and the mess has to be cleaned up. We now seem to be in the mess stage regarding the macroeconomic quality of the New Deal.
A popular story promoted by Monetarist School thinkers is the one about Milton Friedman discrediting the Phillips Curve. For those not familiar with the latter, it’s the incorrect theory embraced by Keynesians that says economic growth is the cause of inflation.
Keynesians presume that speedy growth leads to labor and capacity shortages that result in higher prices. Even if we ignore that labor and capacity are dynamic rather than static, and similarly ignore the technological enhancements that allow companies to produce ever more with less labor and capacity, we can’t ignore that the U.S. is not an island. Even assuming shortages, American producers regularly access the world’s labor and the world’s factories, such that growth could never move the price level in the way the theory assumes.
Monetarists should take note of the U.S. not being an impregnable economic island, for their theory similarly presumes a Fortress U.S.A. They believe dollar credit is controlled by the Fed through the banks it regulates, when in fact credit for dollars is a rather broad concept: any Fed ‘tightness’ has historically been made up for by inflows of dollars from around the world (think of the eurodollar market, among countless others).
Monetarists correctly argued that inflation is always a monetary phenomenon, but the newly revived version of their theory, long ago dismissed even by Friedman himself, is merely a variation of the much-discredited Phillips Curve. To put it plainly, monetarism is a parallel version of Keynesian demand management.