After the gold standard, we got bigger government and a smaller dollar
August marks the 50th anniversary of Richard Nixon’s infamous decision to “close the gold window,” reneging on the U.S. government’s pledge to redeem dollars for gold. Although Nixon’s action spelled the end of the postwar Bretton Woods framework, the system in place circa 1970 was a pale shadow of the original gold standard.
To appreciate just how much governments have transformed their role in the people’s money, it is useful to review how the system originally worked.
From the founding of the constitutional Republic through the eve of the Civil War in 1861, the federal government didn’t issue any legal-tender paper currency at all. Rather, for the official money, the federal government produced gold and silver coins stamped with various dollar denominations.
This is a critical distinction between the old and new ways: under Bretton Woods, the government “pegged” the dollar to gold (at $35 per ounce). U.S. authorities decided how many dollars they were going to print, and if they were too aggressive, then other central banks could turn in their dollars and eventually drain American vaults of the yellow metal. But before the Civil War, government officials weren’t in charge of picking the quantity of dollars at all. Rather, the public determined how many dollars were in circulation by presenting gold or silver to the government for minting as coins according to weights specified in law.
In other words, rather than peg the dollar to gold (or silver), the earlier authorities defined it as a specific weight of the precious metals. Just as a commonly understood meaning of “feet” and “bushels” was required for the enforcement of contracts, so too was the meaning of “dollar” required.
During the Civil War the Union famously issued “greenbacks” — green currency notes not redeemable in coin — while the Confederacy engaged in even more reckless inflation. By 1879, the federal government restored the convertibility of paper dollars to gold (at approximately $20.67 per troy ounce), while silver had been demonetized.
From 1879 until the outbreak of the World War in 1914, the United States was a participant in what has been called the classical gold standard, under which all world powers redeemed their sovereign currencies in a definite weight of gold. Yet note that there had already been a significant weakening of the “hardness” of American money: rather than walking around with gold and silver coins in their pockets, Americans were now in the habit of holding green pieces of paper called “U.S. dollars,” which could be redeemed for gold.
Another major blow occurred in April 1933, when newly inaugurated Franklin D. Roosevelt confiscated the nation’s monetary gold under threat of prison and a $10,000 fine. By 1934 the dollar had been “revalued” at $35 per ounce — implying a 41 percent devaluation — and it was illegal for Americans to even write contracts using the world price of gold as a basis for calculating the dollar payment.
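The 41 percent figure follows directly from the two official gold prices: the dollar’s gold content shrank from 1/20.67 of an ounce to 1/35 of an ounce. A minimal arithmetic sketch (illustrative only, not part of the original argument):

```python
# Devaluation implied by the 1934 gold-price change.
old_price = 20.67  # dollars per troy ounce before 1933
new_price = 35.00  # dollars per troy ounce after the 1934 revaluation

# Gold content of one dollar under each definition.
gold_per_dollar_old = 1 / old_price
gold_per_dollar_new = 1 / new_price

# Fraction of the dollar's gold content lost in the revaluation.
devaluation = 1 - gold_per_dollar_new / gold_per_dollar_old
print(f"Devaluation: {devaluation:.1%}")  # prints "Devaluation: 40.9%"
```

Rounding 40.9 percent gives the 41 percent devaluation cited above.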
In 1944, as Allied victory in World War II became clear, representatives from the major powers hammered out a postwar monetary framework at a posh hotel in Bretton Woods, New Hampshire. The so-called Bretton Woods Agreement established a global dollar-exchange standard, according to which central banks would hold reserves in the form of dollar assets. The U.S. government in turn promised that it would always redeem dollars for gold at $35 an ounce. However, this privilege was extended only to other central banks; ordinary citizens were not allowed to turn in dollars for gold.
It was this faint remnant of the original gold standard that Nixon finally killed on Aug. 15, 1971, when he declared that not even central banks could redeem dollars for gold. With its shackles fully removed, the Federal Reserve opened the monetary spigot, causing the severe price inflation and economic turbulence that characterized the 1970s.
But as our historical sketch has shown, Nixon’s move was merely the final act in a long play; the hardness of the dollar had been gradually weakened from the Civil War onwards. We can check the Bureau of Labor Statistics’ series on the purchasing power of the dollar in major cities to assess the outcome under the different frameworks. From early 1922 through early 1929, during the “Roaring Twenties” before the great crash and Roosevelt’s Depression-era interventions, the purchasing power of the dollar was virtually unchanged, falling less than 1 percent cumulatively over the entire period. From early 1952 through early 1959, during the heyday of the Bretton Woods framework, the dollar lost a cumulative 9 percent of its purchasing power, about 1.3 percent per year. Yet from early 1972 through early 1979, in the wake of Nixon’s decision, the dollar lost a cumulative 40 percent of its value, or about 7 percent per year.
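The per-year figures in the comparison above are compound annual rates derived from the cumulative losses over each seven-year window. A short sketch of that conversion (the cumulative percentages are taken from the text; the helper function is mine):

```python
def annual_loss(cumulative_loss, years):
    """Compound annual rate of purchasing-power loss implied by a
    cumulative loss over a given number of years."""
    return 1 - (1 - cumulative_loss) ** (1 / years)

# Bretton Woods heyday: 9% cumulative loss, early 1952 to early 1959.
print(f"{annual_loss(0.09, 7):.1%}")  # prints "1.3%"

# Post-Nixon: 40% cumulative loss, early 1972 to early 1979.
print(f"{annual_loss(0.40, 7):.1%}")  # prints "7.0%"
```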
Money matters, and the scope of government intervention affects the strength of the currency. Every increase in government power over the money has resulted in a more rapid dilution of the dollar’s purchasing power.
Robert P. Murphy is a research fellow with the Independent Institute in Oakland, Calif. He is the author of Choice: Cooperation, Enterprise, and Human Action (Independent Institute, 2015).