GOOD AND EVIL: IN MONEY WE TRUST - The End of Alchemy: Money, Banking, and the Future of the Global Economy - Mervyn King (2017)


‘The love of money is the root of all evil.’

1 Timothy 6:10 (King James Bible)

‘Evil is the root of all money.’

Kiyotaki and Moore (2002)

In the United States I studied at Harvard as a Kennedy Scholar.1 Later in life, I was a member of the interviewing panel to select new scholars. One young man, who was studying theology at Oxford, entered the room and, obviously a little nervous, sat on the chair in front of a line of eminent figures. The chairman, a distinguished philosopher, started by asking, ‘Tell me, does God have much of a role in theology these days?’ The young man blinked and never recovered. But it made me think that the question one should ask of economists is, ‘Does money have much of a role in economics these days?’

Money is misunderstood because it is so familiar, although not as familiar as many of us might wish. Its function in a capitalist economy is complex, and economists have struggled to understand it. It is not even easy to define because the word is used to mean different things: the notes and coins in our wallets, the value of our total wealth, sometimes even the power that wealth confers, as in ‘money talks’. Whatever it is, we seem to be in thrall to it. In his Epistle to Timothy, quoted above, St Paul put it more bluntly.

The management of money, in rich and poor countries alike, has been dismal. Governments and central banks may talk about price stability, but they have rarely achieved it. During the 1970s, prices doubled in the United States in ten years and in Britain they doubled in five years. In November 1923, prices in Germany doubled in less than four days and GDP fell by over 15 per cent during the year.2 That experience helped to undermine the Weimar Republic and contributed to the rise of Nazi totalitarianism.3 In the film Cabaret, set in Berlin in the 1930s, the MC at the Kit Kat Klub performs a song entitled ‘Money’, which includes the lines:

A mark, a yen, a buck, or a pound

Is all that makes the world go around.

Yet in recent years, with central banks printing money like never before (albeit electronically rather than by churning out notes) and a world recovery still elusive, you could be forgiven for thinking that money doesn’t make the world go round. So what does money do? Why do we need it? And could it eventually disappear?

As Governor of the Bank of England, I would sometimes visit schools to explain money, especially to the younger pupils. Bemused by the fact that I was actually paid for ‘hanging out with my friends’ (the only answer I could come up with to their question ‘What is a meeting?’), they were nonetheless certain about the value of money. I would hold up a £5 note and ask them what it was. ‘Money,’ they would scream. ‘Surely it’s just a piece of paper,’ I would reply, and make as if to tear it in two. ‘No, you mustn’t,’ they gasped, as I hesitated and asked them what the difference was between a piece of paper and the paper note in my hand. ‘Because you can buy stuff with it,’ they explained loudly. And so we went on to discuss the importance of making sure that the amount of stuff you could buy with my note didn’t change drastically from one year to the next. They all got the idea that low and stable inflation was a good thing, and that whatever form money takes, it must satisfy two criteria. The first is that money must be accepted by anyone from whom one might wish to buy ‘stuff’ (the criterion of acceptability). The second is that there is a reasonable degree of predictability as to its value in a future transaction (the criterion of stability).

Most ‘stuff’ is today bought not with notes and coins, but with cheques, debit and credit cards, and by electronic transfers drawn on interest-bearing bank deposits. Economists have long debated how to measure the amount of money in the economy. But since what is accepted as money changes over time with both technology and economic circumstances, the quest for a precise definition has little point. Some people prefer a narrow definition in which money comprises the notes and deposits issued only by the central bank or government. Others prefer a broad definition that includes deposits issued by private banks and accepted in transactions. Yet others would include unused overdraft facilities that can be spent at the borrower’s wish.4 In normal circumstances the amount of money available for the financing of transactions is better captured by a broad measure, although in a banking crisis, as we shall see, a narrower definition may be more appropriate.

When money satisfies the two criteria of acceptability and stability it can be used as a measuring rod for the value of spending, production and wealth. After the Normans conquered England in 1066, they put together an inventory of wealth - houses, cattle and agricultural land - in order to assess the taxable capacity of their new domain. Known as the Domesday Book, the survey (now available online) measured wealth in terms of pounds, shillings and pence, Anglo-Saxon monetary units still in use in my youth before the decimalisation of Britain’s currency in 1971.5

The view that money is primarily an acceptable medium of exchange - a way to buy stuff - underpins the traditional interpretation of the history of money. Specialisation created the need for people to exchange their own production for that of others. Adam Smith’s division of labour did not start with his pin factory. It is as old as the hills, almost literally, with the early specialisation between hunters and cultivators, and the development of a bewildering variety of crafts and skills from early civilisation onwards. Smith described how ‘in a nation of hunters, if anyone has a talent for making bows and arrows better than his neighbours he will at first make presents of them, and in return get presents of their game’.6 A man who spends all day making arrows in order to swap them for meat gives up the possibility of hunting himself for the chance of sharing in a larger catch. To be willing to specialise, the hunter who turns arrow-maker has to be sure that his partner in trade will deliver the ‘present’ of meat.

Smith explained that ‘when the division of labour first began to take place, this power of exchanging must frequently have been very much clogged and embarrassed in its operations.’7 He was referring to the absence of what economists call a ‘double coincidence of wants’: the hunter wants arrows and the arrow-maker wants meat. Without that double coincidence, exchange cannot take place through barter. If the arrow-maker wants corn, and the farmer who grows the corn wants meat, then only a sequence of bilateral transactions will satisfy their wants. Since the transactions are separated in time, and probably space, some medium of exchange - money - enters the picture to allow people to engage in their desired trades.

The history of money is, in this view, the story of how we evolved as social animals, trading with each other. It starts with the use as money of commodities - grain and cattle in Egypt and Mesopotamia as early as 9000 BC. Many other commodities, ranging from cowrie shells in Asia to salt in Africa, were deployed as money. It is, of course, costly to hold stocks of commodities with a useful value; salt kept as money cannot be used to preserve meat. Nevertheless, commodities continued to function as money until relatively modern times. Adam Smith wrote about how commodities like ‘dried cod at Newfoundland; tobacco in Virginia; sugar in some of our West India colonies’ had been used as money and how there was even ‘a village in Scotland where it is not uncommon … for a workman to carry nails instead of money to the baker’s shop or the alehouse’.8 Commodities that had an intrinsic value were used in communities where trust, either in others or in a social convention such as a monetary token, was limited. In the early days of the penal colony of New South Wales, managed by the British Navy, rum was commonly in use as money, and, during the Second World War, cigarettes were used as money in prisoner-of-war camps.

The cost and inconvenience of using such commodities led to the emergence of precious metals as the dominant form of money. Metals were first used in transactions in ancient Mesopotamia and Egypt, while metal coins originated in China and the Middle East and were in use no later than the fourth century BC. By 250 BC, standardised coins minted from gold, silver and bronze were widespread throughout the Mediterranean world.

Governments played an important role in regulating the size and weight of coins. Minted by the authorities, and carrying an emblem denoting official authorisation, coins were by far the most convenient form of money. Officially minted coins were supposed to overcome the problem of counterfeits and of the need to weigh precious metals before they could be used in a transaction - the need to protect the physical object used as money has always been essential to its acceptability. Adam Smith’s close friend, the chemist Joseph Black, said that while teaching at the University of Edinburgh, where students paid the professors in advance, he was ‘obliged to weigh [coins] when strange students come, there being a very large number who bring light guineas, so that I should be defrauded of many pounds every year if I did not act in self-defence against this class of students’.9 Counterfeiting continues today - indeed, coins are counterfeited more often than banknotes.

The use of standardised coinage was a big step forward. Technology, however, did not stand still. As the English economist David Ricardo wrote in 1816:

The introduction of the precious metals for the purposes of money may with truth be considered as one of the most important steps towards the improvement of commerce, and the arts of civilised life; but it is no less true that, with the advancement of knowledge and science, we discover that it would be another improvement to banish them again from the employment, to which, during a less enlightened period, they had been so advantageously applied.10

The drawback of using precious metals as money had been evident since at least the sixteenth century when the first European voyages across the Atlantic led to the discovery of gold and, especially, silver mines in the Americas. The resulting imports of the two metals into Europe produced a dramatic fall in their prices - by around two-thirds. So in terms of gold and silver, the prices of commodities and goods rose sharply. This was the first truly European Great Inflation. Prices increased by a factor of six or so over the sixteenth century as a whole. That experience demonstrated vividly that, whatever form money took, abrupt changes in its supply could undermine the stability of its value.
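Spread over a hundred years, a sixfold rise in prices corresponds to a surprisingly modest average annual rate. A back-of-the-envelope sketch, using the ‘factor of six’ figure quoted above:

```python
# Rough check: if prices rise sixfold over a century, the implied
# average annual inflation rate r satisfies (1 + r) ** 100 = 6.
rate = (6 ** (1 / 100) - 1) * 100
print(round(rate, 1))  # about 1.8 per cent a year
```

Dramatic by sixteenth-century standards, such a rate would barely register against the double-digit inflation of the 1970s described earlier.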

Even more convenient than coin is, of course, paper money, which has for a long time dominated our monetary system. The earliest banknotes appeared in China in the seventh century AD. Later banknotes from the Ming dynasty in China were made from the bark of mulberry trees - the paper is still soft to the touch today.11 The penalty for counterfeiting was death - as advertised on the notes themselves.12 If not backed by gold or some other commodity, paper money is what is known as a pure ‘fiat currency’ - it has no intrinsic value and, crucially, cannot be exchanged for gold or any other valuable commodity at the central bank. It is useful only insofar as other people accept it at face value in exchange for goods and services, and its value depends upon the trust people have in it. The earliest western experiment with paper money was conducted in the United States - not the new post-revolutionary nation, but the pre-revolutionary colonies on the eastern seaboard. Before American independence, the creation of money was the prerogative of the British government. Thus prevented from minting their own coins, the colonists rightly complained of a lack of money to support commerce.13 Whatever gold and silver existed in the colonies (sadly there were no gold or silver mines to provide a new supply) rapidly flowed out to pay for a regular excess of imports over exports to England, which resulted from trade restrictions imposed by the mother country. As a consequence, barter systems and commodity monies, such as tobacco, became the main method of exchange in the colonial economies. Students at Harvard College met their bills by paying in ‘produce, livestock and pickled meat’.14 There was a strong incentive to find a way to create a new form of money. In 1690, Massachusetts started to issue paper money and other colonial governments followed. 
In part the paper money thereby created was backed by explicit promises to redeem the notes in gold or silver at specified future dates, but partly it was a pure fiat currency.15 This was a monetary experiment on a grand scale. As that great man Benjamin Franklin wrote in 1767:

Where the Sums so emitted were moderate and did not exceed the Proportion requisite for the Trade of the Colony, such Bills retain’d a fixed Value when compar’d with Silver without Depreciation for many Years … The too great Quantity has, in some Colonies, occasioned a real depreciation of these Bills, tho made a Legal Tender … This Injustice is avoided by keeping the Quantity of Paper Currency within due Bounds.16

The issuing of such colonial paper money did not, on the whole, prove inflationary.17 By and large, the colonists understood Franklin’s admonitions and created sufficient paper money to meet the needs of commerce but not so much as to generate high inflation.

So far, I have described the traditional view of the history of money. It explains how and why commodity money came into existence, and the role of precious metals as standardised coins. But the replacement of commodity by paper money is more difficult to explain. Of course, it is more convenient to buy stuff with paper, but the paradox of money is that people choose to own something that has no intrinsic value, and pays no interest. Over time people chose to hold less of it, and money today largely comprises bank deposits rather than notes and coin. How did the liabilities of banks come to be used as money? To explain this we need an alternative history of money, one that focuses on the role of money as a store of value.

As early as Roman times, and despite the prevalence of coins, money and credit existed in the form of loan contracts. Wealthy individuals acted as banks by extending loans, often exploiting personal knowledge of their customers, and the resulting claims on borrowers were used to make payments because the recipients could in turn pass them on to pay for their own purchases.18 The claims met the criterion of acceptability. In medieval Europe, banknotes evolved out of promissory notes - pieces of paper issued as receipts for gold bullion deposited with goldsmiths and other merchants. The paper money so created was backed by the bullion held by the goldsmith. The holder of the paper claim knew that at any time it could be exchanged for gold. As it became clear that most notes were not in fact immediately converted into bullion but were kept in circulation to finance transactions, merchants started to issue notes that were backed by assets other than gold, such as the value of loans made by the merchants to their customers. Provided the holders of the paper notes were content to carry on circulating them, the assets backing those notes could themselves be illiquid, that is, not suitable for conversion quickly or reliably into money. From this practice emerged the system of banking we see today - illiquid assets financed by liquid deposits or banknotes.

The problem with private banks’ creation of money is obvious. Money in the form of private banknotes and deposits is a claim on illiquid assets with an uncertain value. So both its acceptability and stability can from time to time come under threat. The nature of the problem was illustrated by the experience of ‘free banking’ in the United States, when banknotes were issued by private banks and not central government (the Federal Reserve did not start operating until 1914). The so-called ‘free banking’ era lasted from 1836, when the renewal of the charter of the Second Bank of the United States was vetoed by President Andrew Jackson, until 1863, when the Civil War led to the passing of several National Bank Acts, which imposed taxes on the new issue of banknotes. During that period, most states allowed free entry into banking. For banks, loans are assets and banknotes and deposits are liabilities; the opposite is true for their customers. Hundreds of private banks made loans and financed themselves by taking deposits and printing banknotes. Their assets were holdings of gold and the value of the loans they had extended, and their liabilities were banknotes and deposits, the former typically comprising a larger proportion of liabilities than the latter.19

In principle, banknotes issued by private banks were exchangeable on demand for gold at the bank’s head office at face value, and were backed by a mixture of gold (or silver) and the value of the loan assets held by the bank. But when banknotes were exchanged at significant distances from the head office of the issuing bank, they often traded in the secondary market at discounts to their face value.20 Banknote Reporters - special newspapers that published the latest prices of different banknotes - sprang up to provide information on the value of unfamiliar notes. The discounts varied not only with distance from the head office, but also across banks and, over time, according to perceptions about the creditworthiness and vulnerability to withdrawals of the bank at that moment.

In 1839, an enterprising Philadelphia businessman, Mr Van Court, started to publish what became known as Van Court’s Counterfeit Detector and Bank Note List. It contained his measures of the discount in Philadelphia, then second only to New York as a financial centre, of different banknotes issued by the many hundreds of banks around the United States. For banks from Alabama, the average discount in Philadelphia varied from 1.8 per cent in 1853 to 25 per cent in 1842, and the maximum discount for a single bank was 50 per cent. Connecticut, a state with many more banks than Alabama, had several banks with discounts of over 50 per cent, but on average its banks rarely suffered a discount of more than 1 per cent. Illinois banks, by contrast, regularly experienced average discounts of well over 50 per cent.21 During the era of ‘free banking’ many banks failed and there were frequent financial crises.

The interesting feature of free banking was that it revealed the inherent tension between the use of bank liabilities as money, which requires that notes or deposits exchange at face value, and the risky nature of bank assets. If banknotes in the nineteenth century were exchanged at face value there was a serious risk that the underlying assets might one day be inadequate to support that valuation. There was also the possibility that the owners of banks would issue too many notes, invest in risky assets and, if necessary, shut down the bank and disappear. Worried about such risks, consumers accepted banknotes only at a discount. But since the discount fluctuated over time, the value of banknotes as a means of payment was diminished.

Banknotes were a store of (uncertain) value. If the prices of banknotes always correctly valued the assets of the bank, then the holders of the notes could not be defrauded by over-issue of paper money. But they would in effect have become like shareholders, with a claim on the underlying assets of the bank that varied in value over time. So the value of banknotes as money, with the accompanying requirements of acceptability and stability, was sharply reduced.

The tension inherent in the use of private bank liabilities as money led inexorably to the regulation of banks and, after the experiences with ‘free banking’, to the creation of the Federal Reserve System as America’s central bank. After the Great Depression, the introduction of deposit insurance, with the creation of the Federal Deposit Insurance Corporation (FDIC) in 1933, largely eliminated the risk to ordinary depositors. By transferring the risks to the taxpayer, deposit insurance reduced the likelihood of depositors running on their banks, but it cemented the role of banks as the main creators of money in the form of bank deposits with banknotes issued solely by government.

This alternative view of the history of money has the merit of explaining why bank deposits have come to comprise the vast majority of the money supply. They have an intrinsic value and offer a positive, if small, rate of return (either explicitly as interest, or, in the case of current accounts, implicitly in the form of subsidised money transmission services). As a result, they dominate the value of notes and coin in circulation. Over the past century, the amount of money in the US economy - defined broadly - has remained roughly stable as a proportion of GDP, at around two-thirds, and the share of bank deposits in total money has also been roughly constant at around 90 per cent. Gold and silver, which a hundred years ago amounted to around 10 per cent of total money and were of equal importance to notes and coin, are no longer counted as money. The share of bank deposits in total money is even higher in other major countries, at 91 per cent in the euro area, 93 per cent in Japan and no less than 97 per cent in the United Kingdom.22 What is striking about these figures is that the production of money has become an enterprise of the private sector. The amount of money in the economy is determined less by the need to buy ‘stuff’ and more by the supply of credit created by private sector banks responding to the demand from borrowers. In normal times, changes in the supply of credit will be driven by changes in the demand from borrowers to which banks react, and in turn those developments will reflect the influence of the interest rate set by the central bank. So the fact that banks are the main creator of money does not prevent a central bank from being the major influence on the amount of money in the economy. 
Credit booms are less the result of irresponsible lending by banks and more the outcome of optimism on the part of borrowers, aided and abetted by low interest rates and competition between banks to meet customers’ demands.23 In a crisis, however, changes in the supply of credit may reflect a shift in the willingness of banks to lend, or the market to fund banks, as perceptions of the soundness of the banks are revised downwards. In those circumstances, it is much harder for a central bank to offset the contraction of money by stimulating demand for borrowing, as events since 2008 have shown.

In its role as an acceptable medium of exchange, money is not only necessary, it is a social good. As the historian of Rome, Edward Gibbon, expressed it: ‘The value of money has been established by general consent to express our wants and our property, as letters were invented to express our ideas; and both these institutions, by giving more active energy to the powers and passions of human nature, have contributed to multiply the objects they were designed to represent.’24 But the amount of money created by a private banking system may not always correspond to the amount that is socially desirable. Indeed, where the former exceeds the latter there is a risk of financial excess and inflation, and where the former falls short of the latter there is a risk of a financial crisis. Should money be created privately or publicly? The answer depends on how the choice affects the twin criteria of acceptability and stability.

Acceptability in good and bad times

The traditional view of the history of money stresses the importance of acceptability in transactions for ‘stuff’ - purchases of goods and services. Far more important, however, in a modern economy is the acceptability of money in financial transactions, including the making or repaying of loans, or the buying and selling of financial assets. In situations of extreme uncertainty, some forms of money may no longer be accepted as a means of payment. Cheques, for example, may be refused if there is doubt about the solvency of the bank on which they are drawn.25 In October 2008, the Bank of England saw a sharp rise in the demand for £50 notes as confidence in banks fell - matched by a rise in sales of home safes!26 Moreover, in periods of great uncertainty, the amount of money people want to hold as a liquid store of value may rise sharply. To fulfil its functions, money needs to be acceptable in bad times as well as good, and to be available in sufficient quantities. That is why there is a very close link between money and liquidity, where the latter is the property of a non-monetary asset to be convertible into money quickly and at little cost. Some assets are more liquid than others; for example, stocks and shares of large companies are liquid, houses are not.

Before the recent crisis, financial experts believed that the ‘deep and liquid’ markets in which most financial assets were traded meant that there would always be sufficient access to liquidity. That illusion was destroyed by the events of 2007 and 2008. Some of the ‘deep and liquid’ markets simply closed, not to reopen for many years (mortgage-backed securities, for example, which are discussed in Chapter 4). Others became suddenly illiquid, with a large difference between the price at which one could buy and the price at which one could sell, such as commercial paper issued by non-financial companies (essentially an IOU promising to pay a fixed sum at a specified date a few months hence). It became clear that the only truly liquid assets were cash and bank deposits. As the latter shrank when banks began to stop lending, the Bank of England and the Federal Reserve stepped in to boost total deposits in the banking system. They did this by creating ‘emergency money’ with which to buy large quantities of paper assets (primarily government securities) from the non-bank private sector.

When a central bank buys or sells assets it adds to or subtracts from the supply of money. Someone (usually a financial institution in an auction) who sells $1 million of government bonds to the Federal Reserve receives a cheque drawn on the Fed. When that cheque is deposited in the person’s own bank account, which increases by $1 million, the bank presents the cheque to the Federal Reserve, which then credits the bank with $1 million in its reserve account at the central bank. The immediate effect is that both the money supply and central bank reserves rise by $1 million. The same argument holds in reverse when the central bank wants to reduce the money supply. Changing the amount of money in the economy in this fashion using electronic transactions is simpler and faster than printing notes. But it is creating money just the same. It boosts the money supply by increasing bank deposits.
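The double-entry mechanics of such a purchase can be sketched as a toy model. The accounts and figures below are purely illustrative, not actual Federal Reserve bookkeeping:

```python
# Toy double-entry sketch of a central bank asset purchase.
# "fed", "bank" and "seller" are hypothetical accounts, not real institutions' books.

def central_bank_purchase(amount, fed, bank, seller):
    """The central bank buys `amount` of bonds from a non-bank seller."""
    # The seller hands over bonds and is paid with a cheque drawn on the Fed.
    seller["bonds"] -= amount
    fed["bonds"] += amount
    # The cheque is deposited at the seller's bank: broad money rises...
    seller["deposit"] += amount        # the seller's bank deposit (money)
    bank["deposits_owed"] += amount    # the bank's liability to the seller
    # ...and the bank presents the cheque to the Fed, which credits its reserve account.
    bank["reserves"] += amount         # the bank's asset at the central bank
    fed["reserves_owed"] += amount     # the central bank's liability

fed = {"bonds": 0, "reserves_owed": 0}
bank = {"reserves": 0, "deposits_owed": 0}
seller = {"bonds": 1_000_000, "deposit": 0}

central_bank_purchase(1_000_000, fed, bank, seller)
# Both broad money (the seller's deposit) and central bank reserves
# have risen by $1 million; every balance sheet still balances.
print(seller["deposit"], bank["reserves"])  # 1000000 1000000
```

Running the function with a negative amount reverses the entries, mirroring a sale of assets that shrinks the money supply.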

The sharp increase in the demand for liquidity in 2007-8 was met by the creation of more central bank reserves. This was not because households, companies or banks wanted more money to buy ‘stuff’, but because central bank money was a store of liquidity that offered protection against a very uncertain future for the banking system. Inherent in this role for money is that its demand is liable to sudden and unexpected swings, and it is to such changes that the supply of emergency money must respond. The creation of emergency money adds to the total stock of central bank reserves and notes and coin - known as the ‘monetary base’.

Sharp changes in the balance between the demand for and supply of liquidity can cause havoc in the economy. The key advantage of man-made money is that its supply can be increased or decreased rapidly in response to a sudden change in demand. Such an ability is a virtue, not a vice, of paper or electronic money. When there is a sudden increase in the demand for liquidity it is imperative to increase the supply of the asset that constitutes liquidity in order to prevent a damaging rise in the price of that asset and a corresponding fall in the price of goods and services in the economy. Because gold was in limited supply, those countries that used gold as their monetary standard in the late 1920s and early 1930s suffered falling prices (deflation).27 A sudden fall in the general level of prices tends to go hand-in-hand with a fall in spending today, as households and companies wait to buy things more cheaply tomorrow. The ability to expand the supply of money in times of crisis is essential to avoid a depression. A crisis could, in fact, be defined as a set of circumstances in which the demand for liquidity suddenly jumps.

What the experience of emergency money reveals is that the private sector will not always be able to meet the demand for acceptable money. In bad times, governments may need to issue assets which will be regarded as both acceptable in making payments and reliable as a store of value. To leave the production of money solely to the private sector is to create a hostage to fortune. But there must be confidence in the process that generates changes in money. In an era of paper money, that amounts to trust in the central bank or government that controls money creation.

Stability of the value of money

The second criterion for money to be able to perform its functions in a capitalist economy is that its value - its purchasing power in terms of goods and services - must be in some sense stable. Defining price stability in a world where new goods and services come along that were not available before is a hazardous undertaking. Official statisticians are always adding new entries to the basket of goods and services that they use to calculate the average price level and measure inflation. And they also remove goods and services that no longer account for much of our spending. In 2014, the Office for National Statistics in the UK removed DVD recorders and gardeners’ fees from the Consumer Price Index and replaced them by films streamed over the Internet and fresh fruit snacking pots. Leaving measurement issues to one side, the big question is whether governments can be trusted to maintain the value of money.

Since the Civil War, dollar coins in the United States have carried the words ‘In God We Trust’, and the motto has appeared on dollar bills since 1957.28 Trust is fundamental to the acceptability, and so the value, of money. But it is trust not in God but in the issuer of money, usually governments, that determines its value. And that trust has been sorely tried over the centuries. Whether clipping the coinage (shaving some of the precious metal from the edge of the coin), devaluing the currency or restricting the convertibility of notes into gold, governments, east and west, north and south, have found ways to renege on their promises. It is an old tradition. As Sir William Hunter of the Indian Civil Service wrote in 1868, in his study of Bengal:

The coinage, the refuse of twenty different dynasties and petty potentates, had been clipped, drilled, filed, scooped out, sweated, counterfeited, and changed from its original value by every process of debasement devised by Hindu ingenuity during a space of four hundred years. The smallest coin could not change hands, without an elaborate calculation as to the amount to be deducted from its nominal value. This calculation, it need hardly be said, was always in favour of the stronger party.

Much of the financial history of the past 150 years is the story of unsuccessful attempts to maintain the value of money. The willingness of governments to debase the currency has been illustrated many times - indeed, almost all paper currencies have suffered a massive loss of value, through intention or incompetence, at one time or another - including in medieval China, France during the Revolutionary period, the Revolutionary War in the United States with its Continental currency, the American Civil War with the greenback dollar, Germany in the 1920s under the Weimar Republic, Eastern Europe following the collapse of the Soviet Union, and, most recently, Zimbabwe in 2008 and North Korea in 2009.29 Less dramatically, many industrialised countries, including the United States and the United Kingdom, experienced the Great Inflation of the 1970s.

In a democracy, people cannot be forced to use paper money, although after the French Revolution the Jacobins had a try. They made it a capital offence to use commodities as money. This was a desperate and unsustainable action resulting from the Jacobin policy of debasing their paper money - the assignat and mandat - to make up for a collapse in tax revenues and to finance a war against Prussia. Some years later, in 1815, when Napoleon, after his defeat at Waterloo, was travelling back to Paris to rally his forces, an innkeeper at Rocroi refused to accept a chit for 300 francs as payment for dinner for the Emperor’s entourage, demanding payment in gold instead - ‘as sure a sign as any of Napoleon’s waning authority’.30

When a government is in crisis, there is usually an exodus from the paper money it issues, a collapse of the currency and ‘hyperinflation’ - which is usually defined as a period in which the monthly rate of inflation goes above 50 per cent. That may not sound so bad, but it is equivalent to an annual rate of inflation of well over 1000 per cent. Perhaps the simplest definition of a hyperinflation is when it becomes impossible to keep track of the inflation rate. In the worst hyperinflations the peak monthly inflation rate was several million per cent. It is easier to measure such hyperinflations by the length of time it takes for prices to double (in hours). At the peak of the hyperinflation in Germany, in November 1923, prices doubled every three and a half days. No wonder people paid for their lunch at the beginning of the meal. Printers, busy producing more and more notes, went bankrupt because their machinery wore out sooner than expected and they could not accumulate sufficient reserves to invest in new equipment. The economy collapsed.31 At the end of the First World War, in November 1918, a gold mark (the standard on which paper money was based) was worth 2.02 paper marks. By November 1923, one gold mark was worth one trillion paper marks!32
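
The arithmetic linking these definitions can be sketched in a few lines. The 50 per cent monthly threshold is from the text; the rest is simple compounding (a Python sketch, assuming a thirty-day month):

```python
import math

def annualised(monthly_rate):
    """Compound a constant monthly inflation rate into an annual rate."""
    return (1 + monthly_rate) ** 12 - 1

def doubling_time_days(monthly_rate, days_per_month=30):
    """Days for prices to double at a constant monthly inflation rate."""
    daily_factor = (1 + monthly_rate) ** (1 / days_per_month)
    return math.log(2) / math.log(daily_factor)

# The conventional hyperinflation threshold of 50 per cent a month:
print(f"{annualised(0.50):.0%} a year")                    # ≈ 12,875% a year
print(f"{doubling_time_days(0.50):.0f} days for prices to double")  # ≈ 51 days
```

At the 50 per cent threshold prices take about seven weeks to double; the German peak of November 1923, with prices doubling every three and a half days, implies a monthly rate in the tens of thousands of per cent.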

That experience shaped German attitudes to inflation, and the memory lingers today. But it was by no means the worst hyperinflation on record. That was in Hungary in July 1946 when prices, in terms of the pengö currency, doubled every fifteen hours. There are poignant photographs showing children in Germany playing with bricks made out of worthless paper marks and of street cleaners in post-war Hungary sweeping away piles of pengö notes because it was not worth the effort of picking them up from the pavement.

There is a natural tendency to think of hyperinflations as belonging to the history books. Far from it. The second worst hyperinflation in history took place in Zimbabwe during the first decade of this century. Few economies have collapsed quite so spectacularly as that of Zimbabwe. As inflation rose to absurd levels, citizens abandoned the local currency and started to use foreign currencies. Once some did, others followed - an instance of good money driving out bad. The sale of pre-paid minutes of mobile phone time also flourished as a substitute currency, as it did in a number of other African countries. Inflation peaked in November 2008, at which point prices were doubling every day. The use of other currencies, especially the US dollar, became official policy in early 2009. As a result, inflation quickly dropped to single digits and economic growth resumed.

All of these hyperinflations were caused by the excessive printing of money to finance government deficits that had been allowed to spiral out of control. But even in countries with more stable institutions, such as the United Kingdom and United States, inflation has eroded the value of money over the past century.33 Both countries saw price stability in the nineteenth century, only to experience significant inflation in the twentieth century when prices accelerated rapidly, especially in the immediate aftermath of the First World War and in the later post-war period. The experience of the two countries was broadly similar until the 1970s, when even more rapid inflation in Britain led to a divergence of their price levels. But over the past twenty-five years or so, annual inflation has come under control and averaged close to 2 per cent, which is the current inflation target in the United States, the United Kingdom, the euro area and Japan. Other countries, too, have experienced high inflation at times; few, however, can match the record of stability of Switzerland, which has experienced an average inflation rate of only 2.2 per cent a year since 1880.

Why has money been so difficult to manage? Part of the answer is the failure of political institutions to avoid the temptation to create money either as a source of revenue or as a way to court popularity by engineering a short-term boost to the economy before the resulting rise in inflation becomes apparent. But there have also been significant advances in our understanding of how to manage money. The creation of independent central banks, with a clear mandate to maintain the value of the currency in terms of a representative basket of goods and services (inflation targeting), proved successful in stabilising inflation in the 1990s and early 2000s during the Great Stability. The conquering of inflation across the industrialised world over the past twenty-five years was a major achievement in the management of money, and one, despite the financial crisis, not to be underrated. It was the result of successful institutional design (see Chapter 5).

Nevertheless, designing a system of monetary management that is capable of achieving price stability - providing the right amount of money in good times - and coping with crises - providing the right amount and quality of emergency money in bad times - is by no means straightforward. Neither the private nor the public sector has an unblemished record in striking a balance. That is why over the years, and right up until today, there are those who continue to search for a deus ex machina to provide monetary stability.

Gold versus paper

For some the answer is gold. Indeed, in the United States there is a degree of political support for a constitutional amendment to abolish the Federal Reserve Board and allow the money supply to be determined by an automatic link to gold.34 Few debates in economic history have attracted so much passion as that of the merits of gold versus paper as the basis for our monetary system. Two hundred years ago, William Cobbett railed against the iniquities of paper money and the policies of successive British governments that had broken the link with gold during the Napoleonic Wars. He edited The Political Register, a radical newspaper, and used as his motto ‘Put me on a gridiron and broil me alive if I am wrong’! He was obviously not an economist. In 1828 he published a book, written while imprisoned for treasonous libel, entitled PAPER AGAINST GOLD; or, The History and Mystery of the Bank of England, of the Debt, of the Stocks, of the Sinking Fund, and of all the other tricks and contrivances, carried on by the means of Paper Money.

The book lives up to its title. As the author points out,

The time is now come, when every man in this kingdom ought to make himself, if possible, well acquainted with all matters belonging to the Paper-Money System. It is that System, which has mainly contributed towards our present miseries; and, indeed, without that System those miseries never could have existed in any thing approaching towards their present degree. In all countries, where a Paper-Money, that is to say, a paper which could not, at any moment, be converted into Gold and Silver, has ever existed; in all countries, where this has been the case, the consequence, first or last, has always been great and general misery.

Gold has held a special position as money down the centuries and across the globe. The Egyptians used gold bars as a medium of exchange as far back as the fourth millennium BC. Even when paper money came into existence, its acceptance usually depended on its convertibility into gold. Major currencies were readily convertible into gold at a fixed exchange rate - the ‘gold standard’, as it was called. A country on the gold standard promised to exchange its notes and coin for gold at a fixed price. When a country joined the gold standard its exchange rate against other member countries became fixed. If the exchange rate of, say, the US dollar against the French franc were to fall, then it would be cheaper for American importers of French goods to pay in gold than in depreciated dollars. Gold would flow to France. The US would have less gold to back its supply of paper dollars, which would then contract, pushing up the value of the dollar until it returned to its official price in terms of gold. Although the cost of transporting gold allowed small fluctuations in exchange rates before physical movements of gold became attractive, the automatic nature of such movements kept exchange rates in line.

For most of the nineteenth century, and right up until the early 1930s, the price of gold was fixed at $20.67 per ounce.35 The Great Depression saw a revaluation of gold to around $35 an ounce, where it stayed until 1971, when the United States abandoned the policy of a fixed dollar price of gold. Inflationary pressures in the US, stemming in part from the Vietnam War, put downward pressure on the dollar. Rather than face the recessionary consequences of the need to lower wages and prices to maintain a fixed rate against gold and other major currencies, President Nixon decided to break the link between the dollar and gold for good. The price of gold (per ounce) then rose steadily to around $160 in the inflationary 1970s, moved higher in the 1980s, and fell back only in the 1990s as inflation was conquered. But from around 2000 it rose steadily again, reaching a peak of almost $1800 in 2011 before falling back to below $1100 by late 2015.36 The price of gold is not only volatile but highly sensitive to changes in sentiment about the ability of governments to control their monetary system.

The tension between paper and gold as the basis for our monetary system was revealed to me every day during my time at the Bank of England. The Governor’s office leads on to a small garden in which are planted a number of mulberry trees. The reasons for choosing that type of tree were twofold. First, it was a deliberate homage to the use of their bark in the production of early Chinese banknotes. Second, mulberry trees grow in shallow soil. The soil in the garden had to be shallow because only a couple of feet below was the ceiling of the enormous vault in which the large gold reserves held by the Bank of England were stored. Paper and gold were linked by the trees in this small garden. The garden - and the Bank - had hedged their bets.

The persistent attraction of gold as an acceptable medium of exchange in any set of circumstances stems from the fact that, apart from new mining, its supply is fixed, independent of human decision, and its weight and value can easily be checked. New mining adds only a small amount to the total stock of gold each year. Today, gold mining uses highly advanced technology to dig gigantic open pits. Arguably the largest hole in the ground anywhere in the world is the Super Pit at Kalgoorlie in Western Australia. After around fifteen years, the diggers in Kalgoorlie have created a hole that is 4.5 kilometres in length, 1.2 kilometres wide and 500 metres deep. Excavating this hole has yielded 27 million cubic metres of earth which, after processing with acid, has yielded just 10 cubic metres of gold.37 Those few cubic metres were, however, worth around US$8 billion at the prices of early 2015. The investment paid off. Nevertheless, the 200 tonnes of gold mined from this extraordinary hole is small compared with the 5,500 tonnes (the vast majority of it owned by foreign governments and not the UK), worth US$235 billion, that sat in the vaults underneath my office in the Bank of England and the 6700 tonnes, worth almost US$300 billion, in the vault of the Federal Reserve Bank of New York in downtown Manhattan.38
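
The conversion from tonnes to dollars is straightforward. A sketch, assuming a gold price of $1,250 per troy ounce (roughly the early-2015 level; the exact price behind the figures in the text is not stated):

```python
# One tonne of gold is 32,150.7 troy ounces.
TROY_OZ_PER_TONNE = 32_150.7

def gold_value_usd(tonnes, price_per_oz=1250):
    """Dollar value of a given tonnage at an assumed price per troy ounce."""
    return tonnes * TROY_OZ_PER_TONNE * price_per_oz

# The Super Pit's roughly 200 tonnes:
print(f"${gold_value_usd(200) / 1e9:.1f} billion")  # ≈ $8.0 billion
```

At the same assumed price, 5,500 tonnes comes to about $221 billion; the $235 billion in the text implies a price nearer $1,330 an ounce, a reminder of how sensitive such valuations are to the date on which they are struck.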

For centuries gold has been the most widely accepted form of payment. It is independent of government, and, ironically, governments themselves want to hold reserves in gold because they do not trust other countries to maintain the real value of claims denominated in their own paper currency. But despite its attractions, gold suffers from two major drawbacks as money. First, it is extremely heavy and inconvenient to use, and even when gold coins were used widely by travellers (in the way that we might use travellers’ cheques or credit cards today), coins of smaller denominations were usually made out of metals such as bronze or copper.

The second drawback is more fundamental. The attraction of gold to many - namely that its supply cannot easily be expanded by governments - is in fact a serious weakness. In times of financial crisis, paper money can be created quickly and easily when the demand for liquidity is high; not so the supply of gold. Almost invariably, the gold standard was suspended during a financial panic. The most notorious example was in Britain in 1797 during the wars against Revolutionary France, arguably the first financial crisis in a modern economy. An attempted invasion by 1200 French soldiers was thwarted but added to public concern about the value of paper money; a rush to gold ensued. With the Bank of England’s gold reserves disappearing fast, William Pitt the Younger slapped an order of the Privy Council on the Bank, suspending the convertibility of notes into gold. The printing of notes was stepped up, and the result was inflation and a series of wonderful cartoons by James Gillray. One showed the Bank as a lady of a certain age being violated by the Prime Minister as he tries to get at her gold.39 It was the origin of the Bank’s later nickname, the Old Lady of Threadneedle Street. Convertibility into gold resumed in 1821 and was maintained right through until the First World War.

In normal times, the problem with the gold standard was that, with gold in effectively fixed supply, economic growth meant upward pressure on the price of gold in terms of goods and services. Since the dollar (and sterling) price of gold was fixed, there was, conversely, downward pressure on the dollar (and sterling) prices of goods and services. That deflationary pressure, squeezing wages and profits, pushed down activity and employment. The commitment to gold was seen as a battle between bankers and financial interests, on the one hand, and working people, on the other. Never was this expressed so forcefully as by William Jennings Bryan, the three-time losing Democratic presidential candidate, who concluded his speech to the party convention in 1896 with the words ‘you shall not crucify mankind upon a cross of gold’.40

Keynes’s famous description of gold as ‘a barbarous relic’ was apposite.41 What was especially ‘barbarous’ was the decision in the 1920s to impose substantial deflation on economies in order to go back to the gold standard at the same parities as existed before the First World War. It is certainly arguable that a return to a different set of parities might have enabled a system of fixed exchange rates to be retained, while not putting those economies through a period of deflation which, in the event, led not only to the Great Depression but to the inevitable abandonment of the gold standard itself, starting with Britain in 1931. But breaking the link to gold made it possible to expand the supply of money, and countries were then free to adopt looser monetary policies at home - an appropriate response to the world of the Great Depression. The search for a reliable anchor for the monetary system has continued ever since.

Following the 2008 crisis, both the Federal Reserve and the Bank of England expanded the supply of money sharply in order to meet a sudden increase in demand for liquidity. If the money supply had been determined solely by the available quantity of gold in the world, then neither central bank would have found it easy to prevent a depression. To be sure, enthusiasts of gold and critics of paper money argue that crises would be much less frequent in the absence of discretionary monetary interventions by governments and their central banks.42 But the history of nineteenth-century America, before the creation of the Federal Reserve, does not suggest that a world without central banks would be free of crises. The choice between basing our monetary unit of account on either gold or paper money managed by a central bank has largely been resolved in favour of the latter, partly because of the advantages of discretion in controlling the supply of liquidity during a crisis and partly because of the success in conquering inflation during the 1990s.

But for many, the crisis of 2007-9 is evidence of the continuing folly of central banks and the attractions of an automatic standard for the value of money. And if actions speak louder than words, it is striking that most advanced economies still maintain significant quantities of gold in their official reserves of foreign currencies and commodity money. By far the largest holders of gold are the United States (over 8000 tonnes, comprising 72 per cent of its total reserves of gold and foreign exchange) and the euro area (10,784 tonnes, accounting for 57 per cent of total reserves). China’s holdings have been rising and are now over 1000 tonnes. By contrast, the United Kingdom has only 310 tonnes (11.6 per cent of total reserves) and Japan 765 tonnes (2.5 per cent of total reserves).43

Gold has the advantage that its supply is not dependent on unpredictable human institutions. Its disadvantage is precisely the same - namely that when a discretionary increase in the supply of money would be advantageous to overcome a sudden panic, gold cannot play that role. The evolution of a framework for the issue of paper money, culminating in the 1990s inflation-targeting regime, showed signs of success. But it is still too early to judge whether democratic societies have managed to create sustainable regimes to manage paper money, avoiding the deflationary impact of fixed-supply commodity money on the one hand, and the dangers of excessive inflation from discretionary control of money supply on the other.

Economists and money

In recent years, many economists have been reluctant to use the word ‘money’. If one is very clever, it is indeed possible to talk about monetary policy without using the word ‘money’. The interesting question is why anyone would want to. The explanation largely lies in a pervasive ideological split between ‘Keynesian’ and ‘monetarist’ economists, which dominated debates on economic policy in the post-war period until inflation had been conquered in the 1990s, but has flared up again with the experience of stagnation since 2008. Monetarists, like Milton Friedman of the University of Chicago, believed that the solution to inflation, and the key to stabilising the economy more generally, was to control the rate of increase of the money supply. Friedman pointed to the collapse of the money supply during the Great Depression and advocated a fixed percentage increase in the money supply each year.44 Keynesians believed that fiscal policy was more powerful in controlling the economy, and doubted whether there was a close link between changes in the money supply and movements in the economy. Yet John Maynard Keynes was a monetary economist, and the full title of his magnum opus - published in 1936 and which transformed debates about macroeconomic policy after the Great Depression - is The General Theory of Employment, Interest and Money. Whether monetarist or Keynesian, no economist should ignore the significance of money, even if they disagree about what its role is.

For over two centuries, economists have struggled to provide a rigorous theoretical basis for the role of money, and have largely failed. It is a striking fact that as economics has become more and more sophisticated, it has had less and less to say about money. The apparently obvious idea, articulated by David Hume in the eighteenth century, that the level of prices reflects the balance between the demand for and supply of money has been described by the Nobel Laureate Christopher Sims as ‘obsolete’.45 And even the existence of money has proved something of a mystery for economic theorists. As the eminent Cambridge economist, the late Professor Frank Hahn, wrote: ‘the most serious challenge that the existence of money poses to the theorist is this: the best developed model of the economy cannot find room for it’.46

Why is modern economics unable to explain why money exists? It is the result of a particular view of competitive markets. Adam Smith’s ‘invisible hand’ - the notion that the impersonal forces of competition among a large number of people pursuing their own self-interest would guide resources to activities where they would be used as efficiently as possible - was a beautiful idea. But if the ‘hand’ was invisible, what exactly did it correspond to in the world? For over two hundred years, economists tried to formalise Smith’s proposition and discover under exactly what conditions a competitive market economy would allocate resources efficiently. In the nineteenth century, important contributions came from Frenchman Léon Walras, who taught in Lausanne, and Englishman Alfred Marshall, who taught in Cambridge. Then, in the early 1950s, two economists, Kenneth Arrow and Gerard Debreu, both working in America, finally produced a rigorous explanation of the invisible hand (for which they were subsequently awarded the Nobel Prize).47 They imagined a hypothetical grand auction held at the beginning of time in which bids are made for every possible good and service that people might want to buy or sell at all possible future dates. The process continues until every market has cleared (that is, demand equals supply) with prices, demands and supplies of all goods and services determined in the auction. Life then starts and time unfolds. Because the auction at the beginning of time has done its job, no market needs to reopen in the future. There are, therefore, no further transactions once life starts. Everything has been settled during the initial auction, and all people have to do is to deliver the services, such as employment, for which they have contracted and take delivery of the goods and services that they purchased in the auction. 
There is no need for something called money to act as either a medium of exchange (the ‘double coincidence of wants’ problem is circumvented by the auction), a store of value (there is no requirement for a reserve of savings), or indeed an absolute standard of value (consumers bidding in the auction need only know the relative prices of different goods and services, including labour). Money has no place in an economy with the grand auction.

Central to the Arrow-Debreu view of the world is a special way of dealing with uncertainty about the future. When bidding in the auction, consumers must bid not only for the good they want - lunch in their favourite Manhattan restaurant, say - and the date on which they want it - next Tuesday - but also for the ‘state of the world’ in which they wish to purchase it. For example, if the restaurant is outdoors you might bid high for a table if next Tuesday were to be sunny and perhaps zero for a table if it were cold or wet. Market-clearing prices are likely to be high in the former ‘state of the world’ and low in the latter. The key point is that all transactions can be made in advance because it is possible, in this theoretical view, to identify all relevant states of the world and make the auction contingent on them. In other words, radical uncertainty is ruled out by assumption.

Obviously, there are many ways in which the world is very far removed from this abstract description, apart from the obvious impracticability of organising the grand auction. Two are of particular importance - the need for institutions to police a market economy, and the nature of uncertainty.

The importance of trust

In the theoretical world of the auction economy, people are assumed to fulfil their previously contracted obligations to work and consume. But some people might be tempted to renege on their obligations - for example, to stay at home and enjoy leisure instead of working.48 So there is a strong motive to find a mechanism or institution whereby contracts may be enforced. In practice we rely heavily on the legal system - hugely expensive though it is - to enforce a wide array of contracts. But we also rely on a mechanism that plays an important role in the economic life of all successful societies - trust. The absence of trust leads to economic inefficiency. As the philosopher Onora O’Neill, Chair of the UK Equality and Human Rights Commission, put it in 2002, ‘It isn’t only rulers and governments who prize and need trust. Each of us and every profession and every institution needs trust. We need it because we have to be able to rely on others acting as they say that they will, and because we need others to accept that we will act as we say we will.’49

Economists mistrust trust. They believe that people will pursue their own self-interest given the incentives they face. Finding a cooperative outcome when confronted with the prisoner’s dilemma inherent in the short-term advantage of reneging on a contract is difficult. Shame, ostracism and loss of honour are all ways in which a society can penalise the individualistic pursuit of self-interest when it leads everyone to be worse off. The creation of a social ethic or code of behaviour is a means of escaping the prisoner’s dilemma.

The consequences of the absence of trust for our ability to exchange goods and services are well illustrated (both in novels such as John le Carré’s Smiley’s People and in reality) by the exchanges of spies during the Cold War. The two sides would approach each other from opposite ends of the Glienicke Bridge, which connected East Germany and West Berlin across the River Havel at Potsdam, meeting in the middle to exchange their prisoners. A classic exchange requiring a ‘double coincidence of wants’, it was an instance of pure barter.

Trust in others can make it possible for one party to deliver goods and services to another at one date and receive an agreed delivery of other goods and services at a later date. Some economists have argued that the role of money is to embody and cement that trust. Imagine a world in which each generation lives for only two periods, and is in turn succeeded by the next generation. Each generation wishes to work in the first period of life and then enjoy retirement in the second. The economic challenge is to ensure that each young generation hands over part of its earned income to the retired older generation, hoping or believing that in turn their children will do the same for them. There is a potentially profitable trade between successive younger and older generations, but one that is difficult to enforce. Money might be a way to solve this problem, by supporting a convention under which the younger generation saves in the form of ‘tokens’ that it carries forward into retirement in order to purchase goods and services from the new younger generation.50 Such tokens, or money, could be called a dollar or 100 dollars, or for that matter a mark, a yen, a buck or a pound. As long as everyone continues to believe that tokens will continue to be acceptable in future, everyone can be better off. The use of money facilitates the trust that is necessary to reach the best possible outcome.51 More generally, our inability to make credible pre-commitments, or to trust each other, explains why ‘evil is the root of all money’, to use the phrase coined by the economists Nobuhiro Kiyotaki and John Moore quoted at the beginning of the chapter.52
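
The fragility of the token convention can be caricatured in a few lines of code. This is a deliberately crude sketch under assumptions of my own (two-period lives, a binary choice to accept tokens); it is not the Kiyotaki-Moore model, only the convention described above:

```python
def retires_comfortably(accepts_tokens):
    """For each generation g, consumption in retirement requires that g
    accepted tokens when young AND that generation g+1 accepts them in
    turn. `accepts_tokens` is one boolean per generation."""
    n = len(accepts_tokens)
    return [accepts_tokens[g] and g + 1 < n and accepts_tokens[g + 1]
            for g in range(n)]

# Universal belief: every generation (bar the last, which has no
# successor in this finite sketch) can trade work today for
# consumption in retirement.
print(retires_comfortably([True, True, True, True]))
# [True, True, True, False]

# A single generation refusing the tokens impoverishes its predecessor.
print(retires_comfortably([True, True, False, True]))
# [True, False, False, False]
```

The second run makes the point of the paragraph that follows: the tokens have value only so long as the chain of belief holds.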

The blunt truth, however, is that the implicit intergenerational cooperation that represents the best outcome is supported by trust, not money.53 If the younger generation decides not to support the elder, the existence of tokens will make no difference. And if the older generation has invested in, say, housing, they too could renege on the implicit intergenerational transfer by ‘consuming’ the value of their housing capital by selling it to foreigners or a minority of the wealthy, leaving the young unable to afford to buy the housing stock. That is exactly the intergenerational bargain on which, David Willetts argues, the post-war baby-boomer generation has reneged.54 Trust obviates the need for money, and money without trust has no value. Perhaps it is trust that makes the world go round.

Money and radical uncertainty

The second big difference between the real world and the grand auction is the nature of uncertainty. The auction requires both that we have an exhaustive and complete list of all possible future outcomes so that we can write contracts contingent on all these states, and that we know the probabilities of different outcomes so that we can work out how much to bid for each contingent good or service at each point in the future. But the essence of a capitalist economy is that we cannot imagine today all the new ideas and products that will be discovered in future. If the future is unknowable, then we simply do not know and it is pointless to pretend otherwise.

In 1972, while Joel Grey, as the MC in Cabaret, was singing about money making the world go round, the computer for the whole of Cambridge University was less powerful than that in my smartphone today. The handheld devices we now take for granted were simply unimaginable then. And even if someone had been able to write down a list of possible outcomes that included these developments, I rather doubt that it would have been easy to add to that list all the events that have subsequently occurred in the world, including the fall of the Berlin Wall, the Arab Spring, and other occurrences that were relevant to the profitability of investments and the path of the world economy. Investment is driven by the imagination of individuals who can see opportunities invisible to the rest of us. Their risks are largely uninsurable. Risk-taking by entrepreneurs is an intuitive gamble, not a cool appraisal of expected returns based on a scientific assessment of the probabilities of a known and finite number of possibilities.

How then can people cope with the unimaginable and uninsurable? We may not have a clear idea about the goods and services that we will want to buy in the future, but we know that we need a way to carry forward claims of purchasing power from the present to the future in a form that is generalised in the sense that we do not have to decide today on what we will spend tomorrow. Money gives us the ability to exchange labour today for generalised purchasing power in the future. That is why many savings contracts are denominated in money terms. We do not invest in a bank account that offers us a fixed number of television sets or foreign holidays in the future. We expect to earn an interest rate defined as a percentage increase in the amount of money in our account. Money is not principally a means of buying ‘stuff’ but a way of coping with an uncertain future. We do not know which new goods and services will exist in future, nor what their relative prices will be. There is no auction mechanism today that will allow us to discover that. Maintaining a reserve of purchasing power denominated in a monetary unit reduces the risk from placing one’s eggs in the basket containing only contracts that can be written today. Although we cannot literally insure against the uninsurable, we can try to keep our options open by holding claims on future purchasing power in a general monetary unit of account. Any savings account on which the returns were fixed in money terms would suffice; even a promise of a fixed pension might seem to offer a claim on future purchasing power. But in times of financial stress only money claims issued and guaranteed by government will fully serve the purpose. And in anticipation of the unexpected in a world of radical uncertainty, money does therefore play a special role.

Could a market economy make do without money? It probably could in a simple world where we purchased items for immediate consumption and we all lived in close proximity. In medieval times, village or town markets, often held once a week, played very much that role. But our demand for a growing variety of goods and services outstripped the supply available in a given, small locality long ago. A glance at the Amazon website suggests that we want everything, and we want it now. And the market, whether Amazon or another firm, supplies it. In this more complex world, where people save for, and borrow against, an unknowable future, money plays a special, indeed unique, role. Money is a specific feature of a capitalist economy. Over the centuries, money has evolved from a means of payment designed to circumvent the limitations of a simple barter system to a liquid reserve essential to the operation of a capitalist economy in a real world with an unknowable future.

Money oils the wheels of commerce and finances transactions. There needs to be sufficient money to support the steady expansion of economic activity, but not so much as to generate inflation. Only then can money operate as a credible common measuring rod, whether in the Domesday Book or in modern estimates of gross domestic product. Expanding the amount of money in the economy can be either good or bad, depending on the circumstances. Printing paper money can, as described in Goethe’s famous play Faust, be a stimulus to production when times are bad.55 But the alchemy of money creation fosters the illusion of unbounded pleasure and the temptation to issue so much money in good times that the result is not prosperity but rising inflation, leading to economic chaos and the destruction of prosperity. Few countries suffered more from this pact with the devil than Goethe’s own homeland in the hyperinflation of 1923.

In normal times, a wide range of assets may be accepted as money. In a crisis, central bank money is the ultimate means of payment and store of value. Although gold is unlikely ever to regain its position at the centre of monetary management, it is a store of wealth that is universally acceptable. Other assets, such as bank deposits, which do function as money in good times, may become illiquid as a result of a loss of confidence in their acceptability, and so in a crisis, sufficient ‘emergency money’ needs to be supplied to meet the demand for liquidity. These two roles for money - in ‘good’ and ‘bad’ times - are usually discussed and implemented separately, with the first being seen as ‘monetary policy’ and the second as ‘financial policy’. That compartmentalisation of the different reasons for a central bank to supply money contributed to the failure to understand the evolving problems of the major economies prior to the crisis.

During the twentieth century, governments allowed the creation of money to become the by-product of the process of credit creation. Most money today is created by private sector institutions - banks. This is the most serious fault line in the management of money in our societies today. In his ‘cross of gold’ speech, William Jennings Bryan spoke passionately about the evils of the gold standard - the needs of Main Street should come before those of Wall Street. But almost forgotten are the most important sentences in the speech: ‘We believe that the right to coin money and issue money is a function of government. We believe it is a part of sovereignty and can no more with safety be delegated to private individuals than can the power to make penal statutes or levy laws for taxation … the issue of money is a function of the government and the banks should go out of the governing business.’ He was consciously reiterating Thomas Jefferson, who said in 1809, ‘the issuing power should be taken from the banks and restored to the people, to whom it properly belongs’.56

Why have governments allowed money - a public good - to fall under private control? To answer that, we need to understand the role of banks.