A 16-to-1 gold-to-silver ratio has been the Holy Grail of some silver investors since the mid-sixties.

Unfortunately, fifty years later, it is a quest that continues unabated without success.

In fact, the evidence contradicts that hope and widens the chasm separating wishful thinking from reality.

In the Mint Act of 1792, the U.S. government arbitrarily fixed an official gold-to-silver price ratio of 15 to 1. The Coinage Act of 1834 adjusted that ratio to roughly 16 to 1, with gold priced at $20.67 per ounce and silver at $1.29 per ounce.
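The arithmetic behind the 16-to-1 figure can be checked directly from those two official prices:

```python
# Gold-to-silver ratio implied by the official U.S. prices cited above.
gold_price = 20.67   # USD per ounce of gold
silver_price = 1.29  # USD per ounce of silver

ratio = gold_price / silver_price
print(round(ratio, 2))  # just over 16, hence the "16-to-1" shorthand
```

The quotient is not exactly 16 (it is about 16.02), which is why the ratio is usually described as "roughly" 16 to 1.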

Prior to 1792, the U.S. did not strike its own coinage. That changed with the establishment of the Philadelphia Mint, which was also authorized by the Mint Act of 1792.

The official price of silver and the market value of silver remained relatively close until the late 1800s.

In 1859, prospectors discovered the Comstock Lode in Virginia City, Nevada. It was the largest silver vein in the world.

Combined with silver already in circulation, this additional supply “flooded the market” and forced the value of silver well below its official price of $1.29 per ounce. This is another classic, historical example of inflation in its pure sense – a devaluation of money caused by an expansion of its supply. The silver in a silver dollar was now worth much less than the official price of $1.29 per ounce. (Also see: Mansa Musa, Gold, And Inflation)

Congress responded by passing the Coinage Act of 1873, which demonetized silver and ended the free coinage of silver dollars in the U.S. Five years later it reversed itself with the Bland-Allison Act, which restored the silver dollar as legal tender and required the U.S. Treasury to buy large quantities of the metal. Silver producers were awash in supply, and it was hoped that the purchase program would create more jobs within the mining industry.

A series of later legislative efforts either repealed earlier bills or expanded the U.S. Treasury's obligations to purchase silver, whether to support the market or to supply the production of silver coinage.
