The Intricate Makeshift Money Germans Relied On Between World Wars
by Kelsey Campbell-Dollaghan  /  9.24.13

State-issued currency is the scaffolding upon which capitalism was built, but it’s always been prone to mayhem. For instance, in 1920s Germany, extreme inflation forced German businesses to print millions of their own customized paper bills. Now largely forgotten, this notgeld, or “emergency money,” was once ubiquitous—amounting to an ornately decorated I.O.U. in Weimar Germany.

Notgeld was a catch-all name for private currency, printed between World War I and World War II in Germany and Austria. There are hundreds—maybe thousands—of unique bills, each created for a specific amount of gold, cash, or even corn and grain. Each printer created (or commissioned) its own design, which ranged from beautiful turn-of-the-century engravings to modernist Bauhaus-inspired typography. The most complete collection of notgeld online comes courtesy of Brooklynite Miguel Oks, whose German ancestors began archiving the bills in the 1930s—thousands of which you can see on his Flickr.

So what sparked this proliferation of wildly decorative—and often quite beautiful—emergency currency? There’s a long version and a short version, the latter of which began during World War I, with incredibly rapid inflation spurred by the cost of war. Compounding the problem, the demand for metals used to make weapons and ammunition caused the value of traditional coinage to skyrocket—and soon, banks were printing more and more paper money to make up for the disappearing coins. Even after the Great War ended, strict reparations and a subsequent depression made for even more inflation—this was Mack the Knife-era Weimar, where hunger and unemployment were the norm. Companies were often forced to issue specialized notgeld to pay their employees, simply because the state-run mints couldn’t print enough money to satisfy the demand for bills. So instead, businesses and organizations made their own—and according to Oks, it was often even more stable than conventional bills, since it was tied to gold or another tangible resource.

Fascinatingly, there was also a financial logic to the elaborate decorations that grace many of these bills. Miguel Oks explains:

They made it very pretty on purpose: many people collected the bills, and the debt would never have to be paid. Many were specifically made for collecting, they were called “Serienscheine”, and special albums were sold for the specific purpose of organizing and displaying them. They were printed on all kinds of materials: leather, fabric, porcelain, silk, tin foil…

So the decorations on notgeld bills weren’t just “of their time.” They were actually calculated attempts to create collector’s items—which would thus never be turned in for actual compensation. Of course, financial instability—and all the social ills that came with it—would play a huge role in the rise of National Socialism. If you look closely, the designs on some of these bills speak to the earliest inklings of Nazi ideology, too, from wounded German soldiers to Germanic mythological figures—innocuous signals of darker times ahead. But they also offer a fascinating glimpse into the life and times of this hard-fought era. Check out some of the voluminous collection below.

His entire collection, consisting of over 5,500 notes, can be viewed on Flickr.


“After 800 years of life in the same region, part of my family left Germany. In 1935 Nazism had become unlivable and the danger too clear. They were lucky enough to understand the risk it was for Jews living in Germany and they left. Until then, they had been part of a comfortable and prosperous middle class, involved in the tobacco business in the city of Karlsruhe. The collection was started by our ancestor when he noticed that Notgeld was not the norm but the exception in the history of currencies. He started collecting Notgeld produced by many German and Austrian towns and companies to confront deflation first and inflation later, with the objective of providing stability to workers and residents. Notgeld (emergency currency) was issued by cities, boroughs, even private companies while there was a shortage of official coins and bills. Nobody would pay in coins while their nominal value was less than the value of the metal. And when inflation went on, the state was just unable to print bills fast enough. Some companies couldn’t pay their workers because the Reichsbank just couldn’t provide enough bills. So they started to print their own money – they even asked the Reichsbank beforehand. As long as the Notgeld was accepted, no real harm was done and it was just a certificate of debt. Often it was even a more stable currency than real money, as sometimes the denomination was a certain amount of gold, dollars, corn, meat, etc.

They made it very pretty on purpose: many people collected the bills, and the debt would never have to be paid. Many were specifically made for collecting, they were called “Serienscheine”, and special albums were sold for the specific purpose of organizing and displaying them. They were printed on all kinds of materials: leather, fabric, porcelain, silk, tin foil… It was not legal tender, so the only people who dealt in it were those that wanted to. It was very stable and debt free. To keep it flowing, sometimes it was set up to lose 2 or 3% of its value every month, which kept people from hoarding it. There were several advantages to issuing Notgeld. First, it stabilized local government and local markets, so people could sell and buy what they needed and government services kept functioning. Second, it was a stabilizing influence on the real currency, which was still used. And third, it helped to concentrate the real currency at the government level, so they could import things not found locally. It was a controlled complementary currency, so prices were set by whoever issued it. In effect, this created wide scale and orderly rationing.

At a personal level, my interest in these notes lies in the fact that every one of these pieces of paper carries the seed of the development of twentieth century artistic and political movements. These artistic and ideological movements still influence our thinking and inform our consciousness, our taste and every aspect of our life. I cannot but shiver at the emergence of National Socialism palpable in the art of many of these notes. I admire the level of craftsmanship and obsession that characterizes this nation. But when looking at these virtues in a historical context we see what they have come to mean for our civilization, and it does not allow me to see the consequences in a positive light. I still buy Notgeld occasionally; I have about 5,500 notes, and about 125,000 different ones were minted, so I don’t expect ever to have a complete collection. Every once in a while I open the binders where they are stored and enjoy the designs.”
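One mechanism in the account above rewards a closer look: notes that were set up to lose 2 or 3% of their value every month so that people would spend rather than hoard them, a scheme now known as demurrage. A minimal sketch of the arithmetic, in Python (the function name and figures are illustrative, not taken from any actual note):

```python
def demurrage_value(face_value, monthly_rate, months):
    """Remaining value of a demurrage note after `months`,
    with `monthly_rate` of its value expiring each month."""
    return face_value * (1.0 - monthly_rate) ** months

# At 2% per month, a 100-mark note keeps only about 78.5 marks of value
# after a year, a strong nudge to keep it circulating.
year_later = demurrage_value(100, 0.02, 12)
```

The decay compounds, so even a small monthly rate halves the note's value in roughly three years at 2% per month.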


Civil War Tokens: Value Me As You Please  /  December 20, 2011

“During the US Civil War, metal monies were hoarded for their value, resulting in a shortage of available coins. The Union government issued official “paper coins” that weren’t backed by gold or silver. This “faith paper” lost value quickly, and for a short while, stamps were official currency. That didn’t take, either, so enterprising individuals took it upon themselves to mint their own coinage. These are now known as Civil War Tokens (CWTs), and were made and used between late 1862 and mid 1864. On April 22, 1864, Congress set the weight of coins and set punishment for counterfeiting coins of up to one thousand dollars and imprisonment up to five years. Yet there are over ten thousand varieties of tokens, representing 22 states, 400 towns and about 1,500 individual merchants. Melvin Fuld and his son Dr. George Fuld wrote key books in the CWT field, creating the rarity scale and composition key used by most numismatists. Given the sheer number of CWTs, starting a collection might be daunting. Enter collector Ken Bauer, whose method breaks down the vast world into smaller collections, from anvils to watches and so much more.”



The shadow banking system is vastly bigger than regulators thought / September 17, 2013

In most parts of the world, the banking system is closely regulated and monitored by central banks and other government agencies. That’s just as it should be, you might think. But banks have a way round this kind of regulation. For the last decade or so, it has become common practice for banks to do business in ways that don’t show up on conventional balance sheets. Before the 2008 financial crisis, for example, many investment banks financed mortgages in this way. To all intents and purposes, these transactions are invisible to regulators. This so-called shadow banking system is huge and important. Indeed, many economists blame activities that took place in the shadow banking system for the 2008 crash. But the size of the system is hard to measure because of its hidden and impenetrable nature. Today, however, Davide Fiaschi, an economist at the University of Pisa in Italy, and a couple of pals reveal a powerful and simple way of determining the size of the shadow banking system. Their conclusions are revealing. They say that the shadow banking system is vastly bigger than anyone had imagined before. And although its size dropped dramatically after the financial crisis in 2008, it has since grown dramatically and is today significantly bigger than it was even then. Perhaps the biggest problem with measuring the shadow banking system is that nobody quite knows how to define it. Economists say it includes activities such as hedge funds, private equity funds, credit insurance providers and so on. But there is significant debate over where to draw the line.

The de facto arbiter of this question is the Financial Stability Board, successor to the Financial Stability Forum set up in 1999 by the Group of Seven developed nations. It estimates the size of the shadow banking system each year by adding up all the transactions that fall outside mainstream regulation, or at least as much of this as it can see. The Board estimated the size of the shadow banking system to be just over $60 trillion in 2007, the year before the great financial crash. This figure dropped a little in 2008 but rose again to $67 trillion in 2011. That’s more than the total GDP of the 25 countries from which the figures are obtained. Now Fiaschi and co say the Financial Stability Board has severely underestimated the total. These guys have developed an entirely different way of calculating its size using the emerging discipline of econophysics. They begin with the empirical observation that when economists plot the distribution of companies by size, the result is a power law. In other words, there are vastly many more small companies than there are large ones, and the difference is measured in powers of 10. So not 2 or 3 or 4 times as many but 100 (10^2), 1,000 (10^3) or 10,000 (10^4) times as many. These kinds of power laws are ubiquitous in the real world. They describe everything from the size distribution of cities and websites to casualties in war. That’s not really surprising. A power law is always the result when things grow according to a process known as preferential attachment, or in common parlance, the rich-get-richer effect. In economic terms, big businesses grow faster than smaller ones, perhaps because people are more likely to work with big established companies. Whatever the reason, it is a well observed effect. Except in the financial sector. Fiaschi and co say that this power law accurately governs the distribution of small and medium-sized companies in the financial world. But when it comes to the largest financial companies, the law breaks down.
For example, the UK’s Royal Bank of Scotland is the 12th largest firm on the planet with assets of $2.13 trillion. If the size of these firms followed the same power law, the largest would be roughly ten times bigger than the 10th on the list. But that isn’t the case: the world’s largest, Fannie Mae, has assets worth $3.2 trillion, just 50% larger than the Royal Bank of Scotland. Why the discrepancy? Fiaschi and co hypothesise that the difference is equal to the size of the shadow banking system, which is not captured in the balance sheets of the largest financial firms. And if that’s the case, it’s straightforward to calculate its size: the value of the shadow banking system is simply the difference between the value of the largest financial firms and their projected size according to the power law. By this measure, the shadow banking system is significantly bigger than previously thought. Fiaschi and co estimate that in 2007, the year before the financial crisis, it was worth around $90 trillion. This fell to about $70 trillion in 2008 but has since risen sharply to be worth around $100 trillion in 2012.
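The estimator described above can be sketched numerically: fit a power law to the mid-ranked firms on a log-log scale, extrapolate it up to the top ranks, and treat the gap between projected and reported sizes as the shadow estimate. This is a rough reconstruction under assumptions (the rank window, function name, and data are all illustrative, not the authors' actual code):

```python
import numpy as np

def shadow_estimate(sizes, fit_ranks=(10, 100), top_n=10):
    """Estimate the 'missing' mass of the top-ranked firms: the gap
    between their reported sizes and the sizes a power law fitted to
    mid-ranked firms would project for them."""
    sizes = np.sort(np.asarray(sizes, dtype=float))[::-1]  # descending by size
    ranks = np.arange(1, len(sizes) + 1)
    lo, hi = fit_ranks
    # Fit log(size) = intercept + slope * log(rank) on mid-ranked firms,
    # where the power law is assumed to hold.
    slope, intercept = np.polyfit(np.log(ranks[lo:hi]), np.log(sizes[lo:hi]), 1)
    projected = np.exp(intercept) * ranks[:top_n] ** slope
    shortfall = np.maximum(projected - sizes[:top_n], 0.0)
    return float(shortfall.sum())
```

On a pure Zipf distribution (size proportional to 1/rank) the fit is exact and the estimate is zero; any mass "missing" from the largest firms' balance sheets shows up as a positive gap.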

This new Shadow Banking Index has significant advantages over conventional ways of calculating its size. “This index is based on simple and robust statistical features, that are expected to characterize the collective behavior of an economy,” say Fiaschi and co. That’s useful because the growing complexity of the financial markets makes them hard to measure directly. Fiaschi and co point out that any detailed description and classification of financial activity is unlikely to keep pace with the rate of innovation in the financial industry. So the new Shadow Banking Index looks to be an important step towards the proper and meaningful oversight of an industry that is hugely valuable and important and yet increasingly complex and renegade. Of course, there is an 800 lb gorilla in the room: how these financial companies come to be so huge in the first place. The global economy is dominated by financial firms. On the Forbes Global 2000 list of the world’s largest companies, the first non-financial firm is General Electric, which ranks 44th. How can that be? If it isn’t evidence that something is rotten in the state of Denmark, then it’s hard to imagine what would constitute such proof. The size and impenetrability of the shadow banking system is clearly part of the problem, so an index that can measure it quickly and easily is a useful step in the right direction.

The Interrupted Power Law And The Size Of Shadow Banking

Does the Federal Reserve really control the money supply?
by John Aziz  /  May 30, 2013

The Federal Reserve, which issues the United States’ monetary base (bank notes, coins, and bank reserves), has vastly increased its size since 2008 through quantitative easing programs — buying assets including Treasury bonds and mortgage-backed securities in open market operations with newly-created money:

(Federal Reserve Bank of St. Louis)

With such soaring quantities of new money resulting from quantitative easing, many economists including John Williams, Peter Schiff, and Marc Faber have predicted imminent high or hyper inflation. But by the broadest measures, like MIT’s Billion Prices Project, the predictions haven’t played out. Inflation hasn’t soared along with the monetary base. And the money supply, which is different from the monetary base (the actual currency), hasn’t really grown either. Bank notes and coins are the most tangible kind of dollars, but there are many more kinds of things called “dollars” that are used for exchange. Most obviously, credit. Banks create money through the fractional reserve banking system. Banks can lend — and thus create credit — up to 10 times their reserves on hand. This means that while the monetary base has tripled, the M2 money stock — which includes both checking and savings accounts as well as traveler’s checks, time deposits, and money market deposit accounts — has not increased nearly as much:
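The textbook arithmetic behind that tenfold figure is the money multiplier: with a 10% reserve requirement, each dollar of base money can support up to ten dollars of deposits as loans are redeposited and re-lent. A minimal sketch of that geometric series (illustrative, not a model of actual Fed mechanics, which are messier):

```python
def broad_money_ceiling(monetary_base, reserve_ratio):
    """Theoretical maximum deposit money under fractional reserve
    lending: the geometric series base * (1 + (1-r) + (1-r)^2 + ...)
    sums to base / r."""
    return monetary_base / reserve_ratio

def simulate_lending_rounds(monetary_base, reserve_ratio, rounds):
    """Iterate deposit -> reserve -> re-lend cycles, returning total deposits."""
    total, deposit = 0.0, float(monetary_base)
    for _ in range(rounds):
        total += deposit
        deposit *= (1.0 - reserve_ratio)  # the portion that can be re-lent
    return total
```

The round-by-round simulation converges on the closed-form ceiling, which is why the 10x claim and the 10% reserve ratio are two ways of stating the same limit.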

(Federal Reserve Bank of St. Louis)

But even M2 does not encompass the entirety of the money supply. There exists another banking system —the shadow banking system— where credit expansion also takes place. Shadow credit creation takes place via securitization, a process by which debt-based assets like mortgages, credit card debt, and auto debt are pooled together and sold, and via repo, through which assets are pawned to a lender as collateral for credit. One of the stories behind the 2008 crisis was the huge outgrowth of shadow credit creation that preceded it. Yet when the crisis hit, credit markets became spooked, shadow credit creation dried up, and the level of shadow assets began to deflate:
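One way to see how repo expands shadow credit is the collateral chain: cash raised against pledged collateral buys assets that are pledged again, each round shrunk by the lender's haircut. A hedged sketch (the 10% haircut and the function are illustrative; real chains are limited by re-pledging rights and market conditions):

```python
def repo_chain_credit(collateral, haircut, rounds):
    """Cumulative credit raised by re-pledging collateral `rounds` times,
    each pledge discounted by `haircut`. As rounds grow this converges
    to collateral * (1 - haircut) / haircut."""
    total, pledge = 0.0, float(collateral)
    for _ in range(rounds):
        cash = pledge * (1.0 - haircut)  # lender keeps the haircut as a buffer
        total += cash
        pledge = cash                    # the cash funds assets pledged next round
    return total
```

At a 10% haircut, $100 of collateral can in principle support up to $900 of credit, which is why a spooked market raising haircuts (or refusing to re-lend) deflates shadow credit so abruptly.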

(Federal Reserve Bank of St. Louis)

So the hidden story behind the quantitative easing programs is that the new base money that the Fed has pushed into the financial system has been replacing shadow credit that dried up after 2008. The Fed does not control the money supply — most of the money supply has been created through credit. The Fed can only control one small part of the money supply. This is shown in this chart of M4 — the total money supply, including shadow money — created by Professor Steve Hanke of the Cato Institute, with the monetary base issued by the Fed in olive:

(Cato Institute)

Even after all that quantitative easing, the money supply has still shrunk. In fact, quantitative easing may be choking off shadow credit creation. As the Fed buys more and more assets, there are fewer assets left in the market that can be used as collateral for credit creation. This so-called “safe-asset shortage” is one factor that has driven the price of Treasuries as well as corporate bonds and even junk debt to record highs. If choking off shadow credit creation and replacing shadow money with traditional money was the Fed’s implicit goal, then it is succeeding. But the money the Fed has issued since the crisis hasn’t even made up for the shrinkage.



Lars Schall: Dr. Hudes, let’s talk about the World Bank, which is often described as a “Bretton Woods organization,” since it was officially founded at the famous international conference in Bretton Woods, New Hampshire in 1944. However, the plan to establish this bank (and the International Monetary Fund) originated years before with the highly secretive “War and Peace Studies” that were conducted by the Council on Foreign Relations and the US State Department, while the money for the study came from the Rockefeller Foundation. (1) Given this background of being part of the “Grand Area” design and strategy for the post-war world order, isn’t the World Bank really a tool to exercise American hegemony?
Karen Hudes: I take issue with one part of that question – when you say, “American hegemony.” If you unbundle the political structure inside the United States, it’s not what you see is what you get. It’s not that the American citizens are the ones that are running the country. There is a very wealthy group that is secretly, through domination of the press, trying to keep the citizens in the United States in the dark. And so when you say a tool of “American hegemony,” the answer is it is a tool of hegemony but I would take the “American” out of the equation. What you saw in the last presidential election was massive amounts of foreign money coming in, in an attempt to influence voters. (2) That’s the group that I’m talking about and I would be very happy, as a sidebar at some point, to discuss who that group is, where they are, and what they’re doing. Because I didn’t know about that group when I started on my saga, but I found out about them later on. Now I try to tell them that they have to start behaving themselves. They are not above the law; they think they are, but they are subject to the law.

L.S.: Okay, then let us go straight down to the nitty-gritty: Can you name the individuals and institutions of that group?
K.H.: I’ll tell you what I can do; I can point you to a very good study that was done by three systems theorists at the Swiss Federal Institute of Technology in Zurich, ranked as the best university in continental Europe. What they did was examine the interlocking ownership of the world’s 43,000 transnational corporations using mathematical modeling tools. Are you familiar with that study?

L.S.: Yes, I am. I believe you are referring to a study which showed basically that a small group respectively “super-entity” of 147 financial institutions and multinational corporations is pretty much in control of the world economy. (3)
K.H.: Yes, that’s right. So, it’s whoever is behind that group which is in control of 1 percent of the investments but that 1 percent through corporate interlocking directorships is now in control of 40 percent of the assets and 60 percent of the revenues of this set of 43,060 transnational companies. That’s who that group is. Now, do I know who the individuals behind that group are? They’re very good at secretly hiding, so I’m not going to hazard a guess. But once we get the legal machinery in place, we will find out in great detail who these individuals are, and they will be playing by the rules along with everybody else on this planet.


The Armageddon Looting Machine: The Looming Mass Destruction from Derivatives
by Ellen Brown  /  September 17, 2013

Five years after the financial collapse precipitated by the Lehman Brothers bankruptcy on September 15, 2008, the risk of another full-blown financial panic is still looming large, despite the Dodd Frank legislation designed to contain it. As noted in a recent Reuters article, the risk has just moved into the shadows:

[B]anks are pulling back their balance sheets from the fringes of the credit markets, with more and more risk being driven to unregulated lenders that comprise the $60 trillion “shadow-banking” sector.

Increased regulation and low interest rates have made lending to homeowners and small businesses less attractive than before 2008. The easy subprime scams of yesteryear are no more. The void is being filled by the shadow banking system. Shadow banking comes in many forms, but the big money today is in repos and derivatives. The notional (or hypothetical) value of the derivatives market has been estimated to be as high as $1.2 quadrillion, or twenty times the GDP of all the countries of the world combined.

According to Hervé Hannoun, Deputy General Manager of the Bank for International Settlements, investment banks as well as commercial banks may conduct much of their business in the shadow banking system (SBS), although most are not generally classed as SBS institutions themselves. At least one financial regulatory expert has said that regulated banking organizations are the largest shadow banks.

The Hidden Government Guarantee that Props Up the Shadow Banking System
According to Dutch economist Enrico Perotti, banks are able to fund their loans much more cheaply than any other industry because they offer “liquidity on demand.” The promise that the depositor can get his money out at any time is made credible by government-backed deposit insurance and access to central bank funding.  But what guarantee underwrites the shadow banks? Why would financial institutions feel confident lending cheaply in the shadow market, when it is not protected by deposit insurance or government bailouts? Perotti says that liquidity-on-demand is guaranteed in the SBS through another, lesser-known form of government guarantee: “safe harbor” status in bankruptcy. Repos and derivatives, the stock in trade of shadow banks, have “superpriority” over all other claims. Perotti writes:

Security pledging grants access to cheap funding thanks to the steady expansion in the EU and US of “safe harbor status”. Also called bankruptcy privileges, this ensures lenders secured on financial collateral immediate access to their pledged securities. . . . Safe harbor status grants the privilege of being excluded from mandatory stay, and basically all other restrictions. Safe harbor lenders, which at present include repos and derivative margins, can immediately repossess and resell pledged collateral. This gives repos and derivatives extraordinary super-priority over all other claims, including tax and wage claims, deposits, real secured credit and insurance claims. Critically, it ensures immediacy (liquidity) for their holders. Unfortunately, it does so by undermining orderly liquidation.

When orderly liquidation is undermined, there is a rush to get the collateral, which can actually propel the debtor into bankruptcy. The amendment to the Bankruptcy Reform Act of 2005 that created this favored status for repos and derivatives was pushed through by the banking lobby with few questions asked. In a December 2011 article titled “Plan B – How to Loot Nations and Their Banks Legally,” documentary film-maker David Malone wrote:

This amendment which was touted as necessary to reduce systemic risk in financial bankruptcies . . . allowed a whole range of far riskier assets to be used . . . . The size of the repo market hugely increased and riskier assets were gladly accepted as collateral because traders saw that if the person they had lent to went down they could get [their] money back before anyone else and no one could stop them.

Burning Down the Barn to Get the Insurance
Safe harbor status creates the sort of perverse incentives that make derivatives “financial weapons of mass destruction,” as Warren Buffett famously branded them. It is the equivalent of burning down the barn to collect the insurance. Says Malone:

All other creditors – bond holders – risk losing some of their money in a bankruptcy. So they have a reason to want to avoid bankruptcy of a trading partner. Not so the repo and derivatives partners. They would now be best served by looting the company – perfectly legally – as soon as trouble seemed likely. In fact the repo and derivatives traders could push a bank that owed them money over into bankruptcy when it most suited them as creditors. When, for example, they might be in need of a bit of cash themselves to meet a few pressing creditors of their own.

The collapse of . . . Bear Stearns, Lehman Brothers and AIG were all directly because repo and derivatives partners of those institutions suddenly stopped trading and ‘looted’ them instead.

The global credit collapse was triggered, it seems, not by wild subprime lending but by the rush to grab collateral by players with congressionally-approved safe harbor status for their repos and derivatives. Bear Stearns and Lehman Brothers were strictly investment banks, but now we have giant depository banks gambling in derivatives as well; and with the repeal of the Glass-Steagall Act that separated depository and investment banking, they are allowed to commingle their deposits and investments. The risk to the depositors was made glaringly obvious when MF Global went bankrupt in October 2011. Malone wrote:

When MF Global went down it did so because its repo, derivative and hypothecation partners essentially foreclosed on it. And when they did so they then ‘looted’ the company. And because of the co-mingling of clients money in the hypothecation deals the ‘looters’ also seized clients money as well. . . JPMorgan allegedly has MF Global money while other people’s lawyers can only argue about it.

MF Global was followed by the Cyprus “bail-in” – the confiscation of depositor funds to recapitalize the country’s failed banks. This was followed by the coordinated appearance of bail-in templates worldwide, mandated by the Financial Stability Board, the global banking regulator in Switzerland.


The Auto-Destruct Trip Wire on the Banking System
Bail-in policies are being necessitated by the fact that governments are balking at further bank bailouts. In the US, the Dodd-Frank Act (Section 716) now bans taxpayer bailouts of most speculative derivative activities. That means the next time we have a Lehman-style event, the banking system could simply collapse into a black hole of derivative looting. Malone writes:

. . . The bankruptcy laws allow a mechanism for banks to disembowel each other. The strongest lend to the weaker and loot them when the moment of crisis approaches. The plan allows the biggest banks, those who happen to be burdened with massive holdings of dodgy euro area bonds, to leap out of the bond crisis and instead profit from a bankruptcy which might otherwise have killed them. All that is required is to know the import of the bankruptcy law and do as much repo, hypothecation and derivative trading with the weaker banks as you can. … I think this means that some of the biggest banks, themselves, have already constructed and greatly enlarged a now truly massive trip wired auto-destruct on the banking system.

The weaker banks may be the victims, but it is we the people who will wind up holding the bag. Malone observes:

For the last four years who has been putting money in to the banks? And who has become a massive bond holder in all the banks? We have. First via our national banks and now via the Fed, ECB and various tax payer funded bail out funds. We are the bond holders who would be shafted by the Plan B looting. We would be the people waiting in line for the money the banks would have already made off with. . . .

. . . [T]he banks have created a financial Armageddon looting machine. Their Plan B is a mechanism to loot not just the more vulnerable banks in weaker nations, but those nations themselves. And the looting will not take months, not even days. It could happen in hours if not minutes.

Crisis and Opportunity: Building a Better Mousetrap
There is no way to regulate away this sort of risk. If both the conventional banking system and the shadow banking system are being maintained by government guarantees, then we the people are bearing the risk. We should be directing where the credit goes and collecting the interest. Banking and the creation of money-as-credit need to be made public utilities, owned by the public and having a mandate to serve the public. Public banks do not engage in derivatives. Today, virtually the entire circulating money supply (M1, M2 and M3) consists of privately-created “bank credit” – money created on the books of banks in the form of loans. If this private credit system implodes, we will be without a money supply. One option would be to return to the system of government-issued money that was devised by the American colonists, revived by Abraham Lincoln during the Civil War, and used by other countries at various times and places around the world. Another option would be a system of publicly-owned state banks on the model of the Bank of North Dakota, leveraging the capital of the state, backed by the revenues of the state, into public bank credit for the use of the local economy. Change happens historically in times of crisis, and we may be there again today.



How the Black Death Spawned the Minimum Wage
by Stephen Mihm  /  Sep 5, 2013

Fast-food joints, long inhospitable to any kind of labor activism, are suddenly beset by a surge in strikes. Over the past few months, workers at chains such as McDonald’s Corp. have walked off the job in more than 60 cities, demanding a “living wage” of $15 an hour. Regardless of whether the strikes lead to better pay, they have rekindled debate over what constitutes a living wage. That debate, however, has stranger, older and more curious origins than either proponents or detractors of the living wage might imagine.

The story begins in medieval England in 1349. Life, never particularly easy at this time in history, had become especially nasty, brutish and short. The preceding year, the “Great Pestilence,” better known as the Black Death, had arrived in continental Europe. The pandemic, one contemporary noted, “began in India and, raging through the whole of infidel Syria and Egypt,” reached England in 1349, “where the same mortality destroyed more than a third of the men, women and children.” Once the dead had been buried, feudal society was shaken to its core by a startling realization. As this same chronicler complained, “there was such a shortage of servants, craftsmen, and workmen, and of agricultural workers and labourers, that a great many lords and people … were yet without all service and attendance.” Survivors could now command much higher compensation for their work, and they weren’t shy about asking for it: “The humble turned up their noses at employment, and could scarcely be persuaded to serve the eminent unless for triple wages.” In response, King Edward III — a wealthy landowner who was as dependent on serfs as his many lords — issued the “Ordinance of Labourers,” which put a ceiling on how much workers could charge for their labor, setting wages at pre-plague levels. Subsequent amendments of the law — for example, the Statute of Labourers in 1351 — amplified the penalties for paying above set rates.

These laws effectively set what we would call a maximum wage. But the measures reflected something a bit more complicated than an attempt to stick it to the serfs. They embodied a distinctly medieval belief that one’s earnings should be commensurate with one’s station in life. The Catholic theologian Thomas Aquinas wrote that a man’s “external riches” must be sufficient “for him to live in keeping with his condition in life.” Anything less was cruel; anything more was an enticement to sin and a threat to the social order. As the historian Kevin Blackburn has convincingly argued, while laws governing wages initially set a ceiling on compensation, they were ultimately used to set a living wage, arguably as early as 1389, when an amendment to the Statute of Labourers effectively pegged wages to the price of food.

As the centuries passed, the justices of the peace charged with setting maximum wages appear to have begun setting formal minimum wages as well, though the evidence is fragmentary. Nonetheless, the practice eventually gained statutory recognition with the passage of an “Act Fixing a Minimum Wage,” issued in 1604 during the reign of James I and aimed at workers in the textile industry. The idea of encumbering wages with either an upper or lower limit would soon fall victim to the liberalizing tendencies of an increasingly capitalistic England. By the early 19th century, the Statutes of Labourers had been repealed. But the argument over wages didn’t disappear. As labor unrest swept many industrial nations in the 19th century, the concept of the minimum wage or living wage resurfaced in unexpected places.

The first was the Vatican. In 1891, Pope Leo XIII offered a distinctly medieval take on the labor question. In his Rerum Novarum, the pontiff called for the passage of laws to remove “the causes which lead to conflicts between employers and [the] employed.” Foremost among those causes, he averred, was the insufficiency of wages. “To defraud any one of wages that are his due is a great crime which cries to the avenging anger of Heaven,” he declared. But there was an easier way to solve the problem than involving the Almighty. Instead, the pope counseled the revival of the medieval living wage, arguing that the compensation of a wage earner should be sufficient “to support a frugal and well-behaved wage-earner.” The encyclical resonated in nations that had high numbers of both Catholics and aggrieved workers. Among these was Australia, which along with New Zealand would become a cradle of the modern minimum wage movement. In the 1890s, Australian Catholics began agitating for the implementation of a living wage. The year of the encyclical, Australian Cardinal Patrick Francis Moran called for wages sufficient to “yield a competence … for the frugal support of [a worker’s] wife and family.”

The first genuine minimum wage laws were established in the states of Victoria (1894) and New South Wales (1895). They dictated that unskilled workers employed by the government be paid a living wage of seven shillings a day. As one legislator declared in 1894, “the workers should have a rate of payment which would enable them to maintain themselves and their families in decent comfort.” In the succeeding years, support for minimum wage legislation grew. Catholic reformers continued to revive the medieval idea of a living wage. Foremost among these figures was Henry Bournes Higgins, the presiding judge in the Commonwealth Court of Conciliation and Arbitration.

In 1907, Higgins heard a case involving the Sunshine Harvester Works, the largest manufacturer of farming implements in Australia. Under a newly passed law, the company would have to pay a significant tax unless it could prove that it paid its workers “fair and reasonable” wages. The law didn’t set those wages; it was up to the court to decide whether Harvester met that threshold. Higgins rejected the company’s claims that it paid reasonable wages. More important, Higgins declared that the court had the right to set a national minimum wage in the private sector, and he did: seven shillings a day for those working at unskilled labor. Higgins declared that a living wage must be sufficient to provide a “reasonable and frugal comfort.” As Blackburn observed, Higgins effectively “secularized the living wage,” reviving a medieval concept for modern times.

Though Harvester managed to get the decision reversed by a higher court, the opinion quickly became iconic. Higgins and his judicial allies managed to secure widespread acceptance of the idea of a national minimum wage through other opinions. The minimum wage was here to stay. Australia soon became a kind of Mecca for reformers elsewhere, who made the pilgrimage to study these and other innovations firsthand. When reformers in the U.S. proposed a minimum wage to drive wages up, they looked Down Under for their model.

Why a medieval peasant got more vacation time than you
by Lynn Parramore  /  August 29, 2013

Life for the medieval peasant was certainly no picnic. His life was shadowed by fear of famine, disease and bursts of warfare. His diet and personal hygiene left much to be desired. But despite his reputation as a miserable wretch, you might envy him one thing: his vacations. Plowing and harvesting were backbreaking toil, but the peasant enjoyed anywhere from eight weeks to half the year off. The Church, mindful of how to keep a population from rebelling, enforced frequent mandatory holidays. Weddings, wakes and births might mean a week off quaffing ale to celebrate, and when wandering jugglers or sporting events came to town, the peasant expected time off for entertainment. There were labor-free Sundays, and when the plowing and harvesting seasons were over, the peasant got time to rest, too. In fact, economist Juliet Schor found that during periods of particularly high wages, such as 14th-century England, peasants might put in no more than 150 days a year.

As for the modern American worker? After a year on the job, she gets an average of eight vacation days annually. It wasn’t supposed to turn out this way: John Maynard Keynes, one of the founders of modern economics, made a famous prediction that by 2030, advanced societies would be wealthy enough that leisure time, rather than work, would characterize national lifestyles. So far, that forecast is not looking good. What happened? Some cite the victory of the modern eight-hour day, 40-hour workweek over the punishing 70 or 80 hours a 19th century worker spent toiling as proof that we’re moving in the right direction. But Americans have long since kissed the 40-hour workweek goodbye, and Schor’s examination of work patterns reveals that the 19th century was an aberration in the history of human labor. When workers fought for the eight-hour workday, they weren’t trying to get something radical and new, but rather to restore what their ancestors had enjoyed before industrial capitalists and the electric lightbulb came on the scene. Go back 200, 300 or 400 years and you find that most people did not work very long hours at all. In addition to relaxing during long holidays, the medieval peasant took his sweet time eating meals, and the day often included time for an afternoon snooze. “The tempo of life was slow, even leisurely; the pace of work relaxed,” notes Schor. “Our ancestors may not have been rich, but they had an abundance of leisure.”

Fast-forward to the 21st century, and the U.S. is the only advanced country with no national vacation policy whatsoever. Many American workers must keep on working through public holidays, and vacation days often go unused. Even when we finally carve out a holiday, many of us answer emails and “check in” whether we’re camping with the kids or trying to kick back on the beach. Some blame the American worker for not taking what is her due. But in a period of consistently high unemployment, job insecurity and weak labor unions, employees may feel no choice but to accept the conditions set by the culture and the individual employer. In a world of “at will” employment, where the work contract can be terminated at any time, it’s not easy to raise objections.

It’s true that the New Deal brought back some of the conditions that farm workers and artisans from the Middle Ages took for granted, but since the 1980s things have gone steadily downhill. With secure long-term employment slipping away, people jump from job to job, so seniority no longer offers the benefits of additional days off. The rising trend of hourly and part-time work, stoked by the Great Recession, means that for many, the idea of a guaranteed vacation is a dim memory. Ironically, this cult of endless toil doesn’t really help the bottom line. Study after study shows that overworking reduces productivity. On the other hand, performance increases after a vacation, and workers come back with restored energy and focus. The longer the vacation, the more relaxed and energized people feel upon returning to the office. Economic crises give austerity-minded politicians excuses to talk of decreasing time off, increasing the retirement age and cutting into social insurance programs and safety nets that were supposed to allow us a fate better than working until we drop. In Europe, where workers average 25 to 30 days off per year, politicians like French President Francois Hollande and Greek Prime Minister Antonis Samaras are sending signals that the culture of longer vacations is coming to an end. But the belief that shorter vacations bring economic gains doesn’t appear to add up. According to the Organisation for Economic Co-operation and Development (OECD), the Greeks, who face a horrible economy, work more hours than any other Europeans. In Germany, an economic powerhouse, workers rank second to last in number of hours worked. Despite more time off, German workers are the eighth most productive in Europe, while the long-toiling Greeks rank 24th out of 25 in productivity.

Beyond burnout, vanishing vacations make our relationships with families and friends suffer. Our health is deteriorating: depression and higher risk of death are among the outcomes for our no-vacation nation. Some forward-thinking people have tried to reverse this trend, like progressive economist Robert Reich, who has argued in favor of a mandatory three weeks off for all American workers. Congressman Alan Grayson proposed the Paid Vacation Act of 2009, but alas, the bill didn’t even make it to the floor of Congress. Speaking of Congress, its members seem to be the only people in America getting as much down time as the medieval peasant. They get 239 days off this year.

How Poverty Taxes the Brain
by Emily Badger  /  Aug 29, 2013

Human mental bandwidth is finite. You’ve probably experienced this before (though maybe not in those terms): When you’re lost in concentration trying to solve a problem like a broken computer, you’re more likely to neglect other tasks, things like remembering to take the dog for a walk, or picking your kid up from school. This is why people who use cell phones behind the wheel actually perform worse as drivers. It’s why air traffic controllers focused on averting a mid-air collision are less likely to pay attention to other planes in the sky. We only have so much cognitive capacity to spread around. It’s a scarce resource. This understanding of the brain’s bandwidth could fundamentally change the way we think about poverty. Researchers publishing some groundbreaking findings today in the journal Science have concluded that poverty imposes such a massive cognitive load on the poor that they have little bandwidth left over to do many of the things that might lift them out of poverty – like go to night school, or search for a new job, or even remember to pay bills on time.

In a series of experiments run by researchers at Princeton, Harvard, and the University of Warwick, low-income people who were primed to think about financial problems performed poorly on a series of cognition tests, saddled with a mental load that was the equivalent of losing an entire night’s sleep. Put another way, the condition of poverty imposed a mental burden akin to losing 13 IQ points, or comparable to the cognitive difference that’s been observed between chronic alcoholics and normal adults. The finding further undercuts the theory that poor people, through inherent weakness, are responsible for their own poverty – or that they ought to be able to lift themselves out of it with enough effort. This research suggests that the reality of poverty actually makes it harder to execute fundamental life skills. Being poor means, as the authors write, “coping with not just a shortfall of money, but also with a concurrent shortfall of cognitive resources.” This explains, for example, why poor people who aren’t good with money might also struggle to be good parents. The two problems aren’t unconnected. “It’s the same bandwidth,” says Princeton’s Eldar Shafir, one of the authors of the study alongside Anandi Mani, Sendhil Mullainathan, and Jiaying Zhao. Poor people live in a constant state of scarcity (in this case, scarce mental bandwidth), a debilitating environment that Shafir and Mullainathan describe in a book to be published next week, Scarcity: Why Having Too Little Means So Much. What Shafir and his colleagues have identified is not exactly stress. Rather, poverty imposes something else on people that impedes them even when biological markers of stress (like elevated heart rates and blood pressure) aren’t present. Stress can also positively affect us in small quantities. An athlete under stress, for example, may actually perform better. Stress follows a kind of classic curve: a little bit can help, but beyond a certain point, too much of it will harm us.
This picture of cognitive bandwidth looks different. To study it, the researchers performed two sets of experiments. In the first, about 400 randomly chosen people in a New Jersey mall were asked how they would respond to a scenario where their car required either $150 or $1,500 in repairs. Would they pay for the work in full, take out a loan, or put off the repair? How would they make that decision? The subjects varied in annual income from $20,000 to $70,000. Before responding, the subjects were given a series of common tests (identifying sequences of shapes and numbers, for example) measuring cognitive function and fluid intelligence. In the easier scenario, where the hypothetical repair cost only $150, subjects classified as “poor” and “rich” performed equally well on these tests. But the “poor” subjects performed noticeably worse in the $1,500 scenario. Simply asking these people to think about financial problems taxed their mental bandwidth. “And these are not people in abject poverty,” Shafir says. “These are regular folks going to the mall that day.”

The “rich” subjects in the study experienced no such difficulty. In the second experiment, the researchers found similar results when working with a group of farmers in India who experience a natural annual cycle of poverty and plenty. These farmers receive 60 percent of their annual income in one lump sum after the sugarcane harvest. Beforehand, they are essentially poor. Afterward (briefly), they’re not. In the state of pre-harvest poverty, however, they exhibited the same shortage of cognitive bandwidth seen in the American subjects in the New Jersey mall. The design of these experiments wasn’t particularly groundbreaking, which makes it all the more astounding that we’ve never previously understood this connection between cognition and poverty. “This project, there’s nothing new in it, there’s no new technology, this could have been done years ago,” Shafir says. But the work is the product of the relatively new field of behavioral economics. Previously, cognitive psychologists seldom studied the differences between different socio-economic populations (“a brain is a brain, a head is a head,” Shafir says). Meanwhile, other psychology and economics fields were studying different populations but not cognition. Now that all of these perspectives have come together, the implications for how we think about poverty – and design programs for people impacted by it – are enormous. Solutions that make financial life easier for poor people don’t simply change their financial prospects. When a poor person receives a regular direct-deposited paycheck every Friday, that does more than simply relieve the worry over when money will come in next. “When we do that, we liberate some bandwidth,” Shafir says. Policymakers tend to evaluate the success of financial programs aimed at the poor by measuring how they do financially.
“The interesting thing about this perspective is that it says if I make your financial life easier, if I give you more bandwidth, what I really ought to look at is how you’re doing in your life. You might be doing better parenting. You might be adhering to your medication better.”

The limited bandwidth created by poverty directly impacts the cognitive control and fluid intelligence that we need for all kinds of everyday tasks. “When your bandwidth is loaded, in the case of the poor,” Shafir says, “you’re just more likely to not notice things, you’re more likely to not resist things you ought to resist, you’re more likely to forget things, you’re going to have less patience, less attention to devote to your children when they come back from school.” At the macro level, this means we lost an enormous amount of cognitive ability during the recession. Millions of people had less bandwidth to give to their children, or to remember to take their medication. Conversely, going forward, this also means that anti-poverty programs could have a huge benefit that we’ve never recognized before: Help people become more financially stable, and you also free up their cognitive resources to succeed in all kinds of other ways as well. For all the value in this finding, it’s easy to imagine how proponents of hackneyed arguments about poverty might twist the fundamental relationship between cause and effect here. If living in poverty is the equivalent of losing 13 points in IQ, doesn’t that mean people with lower IQs wind up in poverty? “We’ve definitely worried about that,” Shafir says. Science, though, is coalescing around the opposite explanation. “All the data shows it isn’t about poor people, it’s about people who happen to be in poverty. All the data suggests it is not the person, it’s the context they’re inhabiting.”

Nathan Yau’s data visualization maps the food deserts in the United States. 

Violent behavior linked to nutritional deficiencies  /  03 Sep 2013

Deficiencies of vitamins A, D, K, B1, B3, B6, B12 and folate, and of minerals iodine, potassium, iron, magnesium, zinc, chromium and manganese can all contribute to mental instability and violent behavior, according to a report published in the Spring 2013 issue of Wise Traditions, the journal of the Weston A. Price Foundation. The article, “Violent Behavior: A Solution in Plain Sight,” by Sylvia Onusic, PhD, CNS, LDN, seeks reasons for the increase in violent behavior in America, especially among teenagers. “We can blame violence on the media and on the breakdown of the home,” says Onusic, “but the fact is that a large number of Americans, living mostly on devitalized processed food, are suffering from malnutrition. In many cases, this means their brains are starving.”

In fact, doctors are seeing a return of nutritional deficiency diseases such as scurvy and pellagra, which were declared eradicated long ago by public health officials. Many of these conditions cause brain injuries as well. Pellagra, for example, a disease caused by deficiency of vitamin B3, produces symptoms including anxiety, hyperactivity, depression, fatigue, headache, insomnia and hallucinations. Zinc deficiency is linked with angry, aggressive, and hostile behaviors that result in violence. The best dietary sources of zinc are red meat and shellfish. Leaky gut and gluten sensitivities may exacerbate nutrient deficiencies. Gluten intolerance is strongly linked with schizophrenia. “Making things worse are excitotoxins so prevalent in the food supply, such as MSG and Aspartame,” says Onusic. “People who live on processed food and who drink diet sodas are exposed to these mind-altering chemicals at very high levels.” In an effort to curb child obesity, the dairy industry recently petitioned the FDA to include aspartame and other artificial sweeteners in dairy beverages featured in school lunches, without appropriate labeling. Recent research has found that aspartame actually leads to weight gain because of its effect on insulin. Other ingredients in the food supply linked to violent behavior include sugar, artificial colors and flavorings, caffeine, alcohol and soy foods. The toxic environmental burden includes mercury, arsenic, lead, fire retardants, pesticides, heavy metals and Teflon. Adding psychiatric drugs to this mix puts everyone at risk. “The only solution to the mounting levels of violence is a return to real, nutrient-dense food,” says Sally Fallon Morell, president of the Weston A. Price Foundation. “We must create a culture in which eating processed food is seen as uncool, and in which home cooking is embraced as a life-enhancing skill.” The Weston A. Price Foundation has pointed out the poor nutritional quality of school lunches and the flaws in the USDA dietary guidelines, which schools receiving federal funding are required to follow. At a press conference in January, 2010, the Foundation proposed guidelines that include eggs, organ meats and healthy animal fats. “Our brains need cholesterol to function properly,” said Fallon Morell, “and our children need cholesterol-rich food for optimal mental and emotional development.” Studies have shown that depressed individuals, offenders who show the most violent behavior, and the most violent suicides have low cholesterol levels.

{Jay Smith/DISCOVER}

Grandma’s Experiences Leave a Mark on Your Genes
Your ancestors’ lousy childhoods or excellent adventures might change your personality, bequeathing anxiety or resilience by altering the epigenetic expressions of genes in the brain.
by Dan Hurley  /  June 11, 2013

Darwin and Freud walk into a bar. Two alcoholic mice — a mother and her son — sit on two bar stools, lapping gin from two thimbles. The mother mouse looks up and says, “Hey, geniuses, tell me how my son got into this sorry state.” “Bad inheritance,” says Darwin. “Bad mothering,” says Freud. For over a hundred years, those two views — nature or nurture, biology or psychology — offered opposing explanations for how behaviors develop and persist, not only within a single individual but across generations. And then, in 1992, two young scientists following in Freud’s and Darwin’s footsteps actually did walk into a bar. And by the time they walked out, a few beers later, they had begun to forge a revolutionary new synthesis of how life experiences could directly affect your genes — and not only your own life experiences, but those of your mother’s, grandmother’s and beyond. The bar was in Madrid, where the Cajal Institute, Spain’s oldest academic center for the study of neurobiology, was holding an international meeting. Moshe Szyf, a molecular biologist and geneticist at McGill University in Montreal, had never studied psychology or neurology, but he had been talked into attending by a colleague who thought his work might have some application. Likewise, Michael Meaney, a McGill neurobiologist, had been talked into attending by the same colleague, who thought Meaney’s research into animal models of maternal neglect might benefit from Szyf’s perspective. “I can still visualize the place — it was a corner bar that specialized in pizza,” Meaney says. “Moshe, being kosher, was interested in kosher calories. Beer is kosher. Moshe can drink beer anywhere. And I’m Irish. So it was perfect.” The two engaged in animated conversation about a hot new line of research in genetics. 
Since the 1970s, researchers had known that the tightly wound spools of DNA inside each cell’s nucleus require something extra to tell them exactly which genes to transcribe, whether for a heart cell, a liver cell or a brain cell.

One such extra element is the methyl group, a common structural component of organic molecules. The methyl group works like a placeholder in a cookbook, attaching to the DNA within each cell to select only those recipes — er, genes — necessary for that particular cell’s proteins. Because methyl groups are attached to the genes, residing beside but separate from the double-helix DNA code, the field was dubbed epigenetics, from the prefix epi (Greek for over, outer, above). Originally these epigenetic changes were believed to occur only during fetal development. But pioneering studies showed that molecular bric-a-brac could be added to DNA in adulthood, setting off a cascade of cellular changes resulting in cancer. Sometimes methyl groups attached to DNA thanks to changes in diet; other times, exposure to certain chemicals appeared to be the cause. Szyf showed that correcting epigenetic changes with drugs could cure certain cancers in animals. Geneticists were especially surprised to find that epigenetic change could be passed down from parent to child, one generation after the next. A study from Randy Jirtle of Duke University showed that when female mice are fed a diet rich in methyl groups, the fur pigment of subsequent offspring is permanently altered. Without any change to DNA at all, methyl groups could be added or subtracted, and the changes were inherited much like a mutation in a gene. Now, at the bar in Madrid, Szyf and Meaney considered a hypothesis as improbable as it was profound: If diet and chemicals can cause epigenetic changes, could certain experiences — child neglect, drug abuse or other severe stresses — also set off epigenetic changes to the DNA inside the neurons of a person’s brain? That question turned out to be the basis of a new field, behavioral epigenetics, now so vibrant it has spawned dozens of studies and suggested profound new treatments to heal the brain. 
According to the new insights of behavioral epigenetics, traumatic experiences in our past, or in our recent ancestors’ past, leave molecular scars adhering to our DNA. Jews whose great-grandparents were chased from their Russian shtetls; Chinese whose grandparents lived through the ravages of the Cultural Revolution; young immigrants from Africa whose parents survived massacres; adults of every ethnicity who grew up with alcoholic or abusive parents — all carry with them more than just memories. Like silt deposited on the cogs of a finely tuned machine after the seawater of a tsunami recedes, our experiences, and those of our forebears, are never gone, even if they have been forgotten. They become a part of us, a molecular residue holding fast to our genetic scaffolding. The DNA remains the same, but psychological and behavioral tendencies are inherited. You might have inherited not just your grandmother’s knobby knees, but also her predisposition toward depression caused by the neglect she suffered as a newborn. Or not. If your grandmother was adopted by nurturing parents, you might be enjoying the boost she received thanks to their love and support. The mechanisms of behavioral epigenetics underlie not only deficits and weaknesses but strengths and resiliencies, too. And for those unlucky enough to descend from miserable or withholding grandparents, emerging drug treatments could reset not just mood, but the epigenetic changes themselves. Like grandmother’s vintage dress, you could wear it or have it altered. The genome has long been known as the blueprint of life, but the epigenome is life’s Etch A Sketch: Shake it hard enough, and you can wipe clean the family curse.

Voodoo Genetics
Twenty years after helping to set off a revolution, Meaney sits behind a wide walnut table that serves as his desk. A January storm has deposited half a foot of snow outside the picture windows lining his fourth-floor corner office at the Douglas Institute, a mental health affiliate of McGill. He has the rugged good looks and tousled salt-and-pepper hair of someone found on a ski slope — precisely where he plans to go this weekend. On the floor lies an arrangement of helium balloons in various stages of deflation. “Happy 60th!” one announces. “I’ve always been interested in what makes people different from each other,” he says. “The way we act, the way we behave — some people are optimistic, some are pessimistic. What produces that variation? Evolution selects the variance that is most successful, but what produces the grist for the mill?” Meaney pursued the question of individual differences by studying how the rearing habits of mother rats caused lifelong changes in their offspring. Research dating back to the 1950s had shown that rats handled by humans for as little as five to 15 minutes per day during their first three weeks of life grew up to be calmer and less reactive to stressful environments compared with their non-handled littermates. Seeking to tease out the mechanism behind such an enduring effect, Meaney and others established that the benefit was not actually conveyed by the human handling. Rather, the handling simply provoked the rats’ mothers to lick and groom their pups more, and to engage more often in a behavior called arched-back nursing, in which the mother gives the pups extra room to suckle against her underside. “It’s all about the tactile stimulation,” Meaney says. In a landmark 1997 paper in Science, he showed that natural variations in the amount of licking and grooming received during infancy had a direct effect on how stress hormones, including corticosterone, were expressed in adulthood.
The more licking as babies, the lower the stress hormones as grown-ups. It was almost as if the mother rats were licking away at a genetic dimmer switch. What the paper didn’t explain was how such a thing could be possible.  “What we had done up to that point in time was to identify maternal care and its influence on specific genes,” Meaney says. “But epigenetics wasn’t a topic I knew very much about.” And then he met Szyf.

Postnatal Inheritance
“I was going to be a dentist,” Szyf says with a laugh. Slight, pale and balding, he sits in a small office at the back of his bustling laboratory — a room so Spartan, it contains just a single picture, a photograph of two embryos in a womb. Needing to write a thesis in the late 1970s for his doctorate in dentistry at Hebrew University of Jerusalem, Szyf approached a young biochemistry professor named Aharon Razin, who had recently made a splash by publishing his first few studies in some of the world’s top scientific journals. The studies were the first to show that the action of genes could be modulated by structures called methyl groups, a subject about which Szyf knew precisely nothing. But he needed a thesis adviser, and Razin was there. Szyf found himself swept up to the forefront of the hot new field of epigenetics and never looked back. Until researchers like Razin came along, the basic story line on how genes get transcribed in a cell was neat and simple. DNA is the master code, residing inside the nucleus of every cell; RNA transcribes the code to build whatever proteins the cell needs. Then some of Razin’s colleagues showed that methyl groups could attach to cytosine, one of the chemical bases in DNA and RNA. It was Razin, working with fellow biochemist Howard Cedar, who showed these attachments weren’t just brief, meaningless affairs. The methyl groups could become married permanently to the DNA, getting replicated right along with it through a hundred generations. As in any good marriage, moreover, the attachment of the methyl groups significantly altered the behavior of whichever gene they wed, inhibiting its transcription, much like a jealous spouse. It did so, Razin and Cedar showed, by tightening the thread of DNA as it wrapped around a molecular spool, called a histone, inside the nucleus. The tighter it is wrapped, the harder to produce proteins from the gene. 
Consider what that means: Without a mutation to the DNA code itself, the attached methyl groups cause long-term, heritable change in gene function. Other molecules, called acetyl groups, were found to play the opposite role, unwinding DNA around the histone spool, and so making it easier for RNA to transcribe a given gene. By the time Szyf arrived at McGill in the late 1980s, he had become an expert in the mechanics of epigenetic change. But until meeting Meaney, he had never heard anyone suggest that such changes could occur in the brain, simply due to maternal care. “It sounded like voodoo at first,” Szyf admits. “For a molecular biologist, anything that didn’t have a clear molecular pathway was not serious science. But the longer we talked, the more I realized that maternal care just might be capable of causing changes in DNA methylation, as crazy as that sounded. So Michael and I decided we’d have to do the experiment to find out.”

Actually, they ended up doing a series of elaborate experiments. With the assistance of postdoctoral researchers, they began by selecting mother rats who were either highly attentive or highly inattentive. Once a pup had grown up into adulthood, the team examined its hippocampus, a brain region essential for regulating the stress response. In the pups of inattentive mothers, they found that genes regulating the production of glucocorticoid receptors, which regulate sensitivity to stress hormones, were highly methylated; in the pups of conscientious moms, the genes for the glucocorticoid receptors were rarely methylated. Methylation gums up the works, so when it comes to transcribing the affected gene, the less methylation, the better. In this case, methylation associated with miserable mothering prevented the normal number of glucocorticoid receptors from being transcribed in the baby’s hippocampus. And so, for want of sufficient glucocorticoid receptors, the rats grew up to be nervous wrecks.

To demonstrate that the effects were purely due to the mother’s behavior and not her genes, Meaney and colleagues performed a second experiment. They took rat pups born to inattentive mothers and gave them to attentive ones, and vice versa. As they predicted, the rats born to attentive mothers but raised by inattentive ones grew up to have low levels of glucocorticoid receptors in their hippocampus and behaved skittishly. Likewise, those born to bad mothers but raised by good ones grew up to be calm and brave and had high levels of glucocorticoid receptors.

Before publishing their findings, Meaney and Szyf conducted a third crucial experiment, hoping to overwhelm the inevitable skeptics who would rise up to question their results. After all, it could be argued, what if the epigenetic changes observed in the rats’ brains were not directly causing the behavioral changes in the adults, but were merely co-occurring? Freud certainly knew the enduring power of bad mothers to screw up people’s lives. Maybe the emotional effects were unrelated to the epigenetic change. To test that possibility, Meaney and Szyf took yet another litter of rats raised by rotten mothers. This time, after the usual damage had been done, they infused their brains with trichostatin A, a drug that can remove methyl groups. These animals showed none of the behavioral deficits usually seen in such offspring, and their brains showed none of the epigenetic changes. “It was crazy to think that injecting it straight into the brain would work,” says Szyf. “But it did. It was like rebooting a computer.” Despite such seemingly overwhelming evidence, when the pair wrote it all up in a paper, one of the reviewers at a top science journal refused to believe it, stating he had never before seen evidence that a mother’s behavior could cause epigenetic change. “Of course he hadn’t,” Szyf says. “We wouldn’t have bothered to report the study if it had already been proved.” In the end, their landmark paper, “Epigenetic programming by maternal behavior,” was published in June 2004 in the journal Nature Neuroscience. Meaney and Szyf had proved something incredible. Call it postnatal inheritance: With no changes to their genetic code, the baby rats nonetheless gained genetic attachments due solely to their upbringing — epigenetic additions of methyl groups sticking like umbrellas out the elevator doors of their histones, gumming up the works and altering the function of the brain.

The Beat Goes On
Together, Meaney and Szyf have gone on to publish some two dozen papers, finding evidence along the way of epigenetic changes to many other genes active in the brain. Perhaps most significantly, in a study led by Frances Champagne — then a graduate student in Meaney’s lab, now an associate professor with her own lab at Columbia University in New York — they found that inattentive mothering in rodents causes methylation of the genes for estrogen receptors in the brain. When those babies grow up, the resulting decrease of estrogen receptors makes them less attentive to their babies. And so the beat goes on.

As animal experiments continue apace, Szyf and Meaney have entered into the next great step in the study of behavioral epigenetics: human studies. In a 2008 paper, they compared the brains of people who had committed suicide with the brains of people who had died suddenly of factors other than suicide. They found excess methylation of genes in the suicide brains’ hippocampus, a region critical to memory acquisition and stress response. If the suicide victims had been abused as children, they found, their brains were more methylated. Why can’t your friend “just get over” her upbringing by an angry, distant mother? Why can’t she “just snap out of it”? The reason may well be due to methyl groups that were added in childhood to genes in her brain, thereby handcuffing her mood to feelings of fear and despair.

Of course, it is generally not possible to sample the brains of living people. But examining blood samples in humans is routine, and Szyf has gone searching there for markers of epigenetic methylation. Sure enough, in 2011 he reported on a genome-wide analysis of blood samples taken from 40 men who participated in a British study of people born in England in 1958. All the men had been at a socioeconomic extreme, either very rich or very poor, at some point in their lives ranging from early childhood to mid-adulthood.
In all, Szyf analyzed the methylation state of about 20,000 genes. Of these, 6,176 genes varied significantly based on poverty or wealth. Most striking, however, was the finding that genes were more than twice as likely to show methylation changes based on family income during early childhood versus economic status as adults.

Timing, in other words, matters. Your parents winning the lottery or going bankrupt when you’re 2 years old will likely affect the epigenome of your brain, and your resulting emotional tendencies, far more strongly than whatever fortune finds you in middle age.

Last year, Szyf and researchers from Yale University published another study of human blood samples, comparing 14 children raised in Russian orphanages with 14 other Russian children raised by their biological parents. They found far more methylation in the orphans’ genes, including many that play an important role in neural communication and brain development and function. “Our study shows that the early stress of separation from a biological parent impacts long-term programming of genome function; this might explain why adopted children may be particularly vulnerable to harsh parenting in terms of their physical and mental health,” said Szyf’s co-author, psychologist Elena Grigorenko of the Child Study Center at Yale. “Parenting adopted children might require much more nurturing care to reverse these changes in genome regulation.”

A case study in the epigenetic effects of upbringing in humans can be seen in the life of Szyf’s and Meaney’s onetime collaborator, Frances Champagne. “My mom studied prolactin, a hormone involved in maternal behavior. She was a driving force in encouraging me to go into science,” she recalls. Now a leading figure in the study of maternal influence, Champagne just had her first child, a daughter. And epigenetic research has taught her something not found in the What to Expect books or even her mother’s former lab. “The thing I’ve gained from the work I do is that stress is a big suppressor of maternal behavior,” she says. “We see it in the animal studies, and it’s true in humans. So the best thing you can do is not to worry all the time about whether you’re doing the right thing. Keeping the stress level down is the most important thing.
And tactile interaction — that’s certainly what the good mother rats are doing with their babies. That sensory input, the touching, is so important for the developing brain.”

The Mark Of Cain
The message that a mother’s love can make all the difference in a child’s life is nothing new. But the ability of epigenetic change to persist across generations remains the subject of debate. Is methylation transmitted directly through the fertilized egg, or is each infant born pure, a methylated virgin, with the attachments of methyl groups slathered on solely by parents after birth?

Neuroscientist Eric Nestler of the Icahn School of Medicine at Mount Sinai in New York has been seeking an answer for years. In one study, he exposed male mice to 10 days of bullying by larger, more aggressive mice. At the end of the experiment, the bullied mice were socially withdrawn. To test whether such effects could be transmitted to the next generation, Nestler took another group of bullied mice and bred them with females, but kept them from ever meeting their offspring. Despite having no contact with their depressed fathers, the offspring grew up to be hypersensitive to stress. “It was not a subtle effect; the offspring were dramatically more susceptible to developing signs of depression,” he says. In further testing, Nestler took sperm from defeated males and impregnated females through in vitro fertilization. The offspring did not show most of the behavioral abnormalities, suggesting that epigenetic transmission may not be at the root. Instead, Nestler proposes, “the female might know she had sex with a loser. She knows it’s a tainted male she had sex with, so she cares for her pups differently,” accounting for the results.

Despite his findings, no consensus has yet emerged. The latest evidence, published in the Jan. 25 issue of the journal Science, suggests that epigenetic changes in mice are usually erased, but not always. The erasure is imperfect, and sometimes the affected genes may make it through to the next generation, setting the stage for transmission of the altered traits in descendants as well.

What’s Next?
The studies keep piling on. One line of research traces memory loss in old age to epigenetic alterations in brain neurons. Another connects post-traumatic stress disorder to methylation of the gene coding for neurotrophic factor, a protein that regulates the growth of neurons in the brain. If it is true that epigenetic changes to genes active in certain regions of the brain underlie our emotional and intellectual intelligence — our tendency to be calm or fearful, our ability to learn or to forget — then the question arises: Why can’t we just take a drug to rinse away the unwanted methyl groups like a bar of epigenetic Irish Spring? The hunt is on. Giant pharmaceutical and smaller biotech firms are searching for epigenetic compounds to boost learning and memory. It has been lost on no one that epigenetic medications might succeed in treating depression, anxiety and post-traumatic stress disorder where today’s psychiatric drugs have failed. But it is going to be a leap. How could we be sure that epigenetic drugs would scrub clean only the dangerous marks, leaving beneficial — perhaps essential — methyl groups intact? And what if we could create a pill potent enough to wipe clean the epigenetic slate of all that history wrote? If such a pill could free the genes within your brain of the epigenetic detritus left by all the wars, the rapes, the abandonments and cheated childhoods of your ancestors, would you take it?



“A policeman tries to extinguish a fire on a man after he set himself ablaze outside a bank branch in Thessaloniki in northern Greece September 16, 2011. According to police, the 55-year-old man had entered the bank to ask for a renegotiation of the overdue loan payments on his home and business, which he could not pay, but was refused by the bank.”

by Jon Henley /  15 May 2013

The austerity programmes administered by western governments in the wake of the 2008 global financial crisis were, of course, intended as a remedy, a tough but necessary course of treatment to relieve the symptoms of debts and deficits and to cure recession. But if, David Stuckler says, austerity had been run like a clinical trial, “It would have been discontinued. The evidence of its deadly side-effects – of the profound effects of economic choices on health – is overwhelming.” Stuckler speaks softly, in the measured tones and carefully weighed terms of the academic, which is what he is: a leading expert on the economics of health, with a master’s in public health from Yale, a PhD from Cambridge, a senior research post at Oxford and 100-odd peer-reviewed papers to his name. But his message – especially here, as even the IMF starts to question chancellor George Osborne’s enthusiasm for ever-deeper budget cuts – is explosive, backed by a decade of research, and based on reams of publicly available data: “Recessions,” Stuckler says bluntly, “can hurt. But austerity kills.”

In a powerful new book, The Body Economic, Stuckler and his colleague Sanjay Basu, an assistant professor of medicine and epidemiologist at Stanford University, show that austerity is now having a “devastating effect” on public health in Europe and North America. The mass of data they have mined reveals that more than 10,000 additional suicides and up to a million extra cases of depression have been recorded across the two continents since governments started introducing austerity programmes in the aftermath of the crisis. In the United States, more than five million Americans have lost access to healthcare since the recession began, essentially because when they lost their jobs, they also lost their health insurance. And in the UK, the authors say, 10,000 families have been pushed into homelessness following housing benefit cuts. The most extreme case, says Stuckler, reeling off numbers he knows now by heart, is Greece. “There, austerity to meet targets set by the troika is leading to a public-health disaster,” he says. “Greece has cut its health system by more than 40%. As the health minister said: ‘These aren’t cuts with a scalpel, they’re cuts with a butcher’s knife.’” Worse, those cuts have been decided “not by doctors and healthcare professionals, but by economists and financial managers. The plan was simply to get health spending down to 6% of GDP. Where did that number come from? It’s less than the UK, less than Germany, way less than the US.”

The consequences have been dramatic. Cuts in HIV-prevention budgets have coincided with a 200% increase in HIV infections in Greece, driven by a sharp rise in intravenous drug use against the background of a youth unemployment rate now running at more than 50% and homelessness up by around a quarter. The World Health Organisation, Stuckler says, recommends a supply of 200 clean needles a year for each intravenous drug user; groups that work with users in Athens estimate the current number available is about three. In terms of “economic” suicides, “Greece has gone from one extreme to the other. It used to have one of Europe’s lowest suicide rates; it has seen a more than 60% rise.” In general, each suicide corresponds to around 10 suicide attempts and – it varies from country to country – between 100 and 1,000 new cases of depression. In Greece, says Stuckler, “that’s reflected in surveys that show a doubling in cases of depression; in psychiatry services saying they’re overwhelmed; in charity helplines reporting huge increases in calls”.

The country’s healthcare system itself has also “signally failed to manage or cope with the threats it’s facing”, Stuckler notes. “There have been heavy cuts to many hospital sectors. Places lack surgical gloves, the most basic equipment. More than 200 medicines have been destocked by pharmacies who can’t pay for them. When you cut with the butcher’s knife, you cut both fat and lean. Ultimately, it’s the patient who loses out.” Such phenomena, he says, “are just a few of many effects we’re seeing. And with all this accumulation of across-the-board, eye-watering statistics, there’s a cause-and-effect relationship with austerity measures. These issues became apparent not when the recession hit Greece, but with austerity.” But public health disasters such as Greece’s are not inevitable, even in the very worst economic downturns.
Stuckler and Basu began to look at this before the crisis hit, studying how large personal economic shocks – unemployment, loss of your home, unpayable debt – “literally could get under people’s skin, and cause serious health problems”.

The pair examined data from major economic upsets in the past: the Great Depression in the US; post-communist Russia’s brutal transition to a market economy; Sweden’s banking crisis in the early 1990s; the East Asian debacle later that decade; Germany’s painful labour market reforms early this century. “We were looking,” Stuckler says, “at how rises in unemployment, which is one indicator of recession, affected people’s health. We found that suicides tended to rise. We wanted to see if there was a way these suicides could be prevented.” It rapidly became clear “there was enormous variation across countries”, he says. “In some countries, politicians managed the consequences of recession well, preventing rising suicides and depression. In others, there was a very close relationship between ups and downs in the economy and peaks and valleys in suicides.”

Investment in intensive programmes to help people return to work – so-called Active Labour Market Programmes, well developed in Sweden (where suicides actually fell during the banking crisis) but also effective in Germany – was a factor that seemed to make a big difference. Maintaining spending on broader social protection and welfare programmes helped, too: analysis of data from the 1930s Great Depression in the US showed that every extra $100 per capita of relief in states that adopted the American New Deal led to about 20 fewer deaths per 1,000 births, four fewer suicides per 100,000 people and 18 fewer pneumonia deaths per 100,000 people. “When this recession started, we began to see history repeat itself,” says Stuckler. “In Spain, for example, where there was little investment in labour programmes, we saw a spike in suicides. In Finland, Iceland, countries that took steps to protect their people in hard times, there was no noticeable impact on suicide rates or other health problems. “So I think we really noticed these harms aren’t inevitable back in 2008 or 2009, early in the recession.
We realised that what ultimately happens in recessions depends, essentially, on how politicians respond to them.” Poorer public health, in other words, is not an inevitable consequence of economic downturns; it amounts to a political choice – by the government of the country concerned or, in the case of the southern part of the eurozone, by the EU, European Central Bank and IMF troika.


Stuckler seizes on Iceland as an example of “an alternative. It suffered the worst banking crisis in history; all three of its biggest banks failed, its total debt jumped to 800% of GDP – far worse than what any European country faces today, relative to the size of its economy. And under pressure from public protests, its president put how to deal with the crisis to a vote. Some 93% of the population voted against paying for the bankers’ recklessness with large cuts to their health and social-protection systems.” And what happened? Under Iceland’s universal healthcare system, “no one lost access to care. In fact more money went into the system. We saw no rise in suicides or depressive disorders – and we looked very hard. People consumed more locally sourced fish, so diets have improved. And by 2011, Iceland, which was previously ranked the happiest society in the world, was top of that list again.” What also bugs Stuckler – an economist as well as a public-health expert – is that neither Iceland nor any other country that “protected its people when they needed it most” did so at the cost of economic recovery. “It didn’t break them to invest in programmes to help people get back to work,” he says, “or to save people from homelessness. Iceland now is booming; unemployment has fallen back below 5% and GDP growth is above 4% – far exceeding that of other European countries that suffered major recessions.” Countries such as those in Scandinavia that took what Stuckler terms “wise, cost-effective and affordable steps that can make a difference” have seen the impact reflected not just in improved health statistics, but also in their economies. Which is why, occasionally, the austerity argument angers him. “If there actually was a fundamental trade-off between the health of the economy and public health, maybe there would be a real debate to be had,” he says. “But there isn’t.
Investing in programmes that protect the nation’s health is not only the right thing to do, it can help spur economic recovery. We show that. The data shows that.” Drilling into the data shows the fiscal multiplier – the economic bang, if you like, per government buck spent, or cost per buck cut – for spending on healthcare, education and social protection is many times greater than that for money ploughed into, for example, bank bailouts or defence spending. “That,” says Stuckler, “seems to me essential knowledge if you want to minimise the economic damage, to understand which cuts will be the least harmful to the economy. But if you look at the pattern of the cuts that have happened, it’s been the exact opposite.” So in this current economic crisis, there are countries – Iceland, Sweden, Finland – that are showing positive health trends, and there are countries that are not: Greece, Spain, now maybe Italy. Teetering between the two extremes, Stuckler reckons, is Britain. The UK, he says, is “one of the clearest expressions of how austerity kills”. Suicides were falling in this country before the recession, he notes. Then, coinciding with a surge in unemployment, they spiked in 2008 and 2009. As unemployment dipped again in 2009 and 2010, so too did suicides. But since the election and the coalition government’s introduction of austerity measures – and particularly cuts in public sector jobs across the country – suicides are back.

Ministers seem unwilling to address the increase in suicides, arguing it is too early to conclude anything from the data. Stuckler points out that this is because the Department of Health prefers to use three-year rolling averages that even out annual fluctuations. But based on the actual data, he is in no doubt. “We’ve seen a second wave – of austerity suicides,” he says. “And they’ve been concentrated in the north and north-east, places like Yorkshire and Humber, with large rises in unemployment. Whereas London … We’re now seeing polarisation across the UK in mental-health issues.” He cites, also, the dire impact on homelessness – falling in Britain until 2010 – of government cuts to social housing budgets, and the human tragedies triggered by the fitness-for-work evaluations, designed to weed out disability benefit fraud. “What’s so particularly tragic about those,” he says, “is that the government’s own estimate of fraud by persons with disabilities is less than the value of the contract awarded to the company carrying out the tests.” At least, though, no one in the UK has been denied access to healthcare – yet. Stuckler confesses to being “heartbroken” at what he sees happening to the NHS. “Britain stood out as the great protector of its people’s health in this recession,” he says. “By all measures – public satisfaction, quality, access – the UK was at or near the top, and at very low relative cost.” But that, he says, is now changing. “I don’t know if people quite realise how fundamental this government’s transformation of the NHS is,” he says. “And once it’s in place, it will be difficult, if not impossible, to reverse.
We haven’t yet seen here what can happen when people are denied access to healthcare, but the US system gives us a pretty clear warning.” He finds this all in stark and depressing contrast to the post-second world war period, when Britain’s debt was more than 200% of GDP (far higher than any European country’s today, bar Iceland) and the country’s leaders responded not by cutting spending but by founding the welfare state – “paving the way, incidentally, for decades of prosperity. And within 10 years, debt had halved.”

The Body Economic should come as a broadside, morally armour-plated and data-reinforced. The austerity debate, Stuckler says, is “a public discussion that needs to be held. Politicians talk endlessly about debts and deficits, but without regard to the human cost of their decisions.” What its authors hope is that politicians will take the message they have uncovered in the data seriously, and start basing policy on evidence rather than ideology. (Some already do. When Stuckler and Basu presented some of their findings in the Swedish parliament, the MPs’ response was: “Why are you telling us this? We know it. It’s why we set up these programmes.” Others, notably in Greece, have sought to divert responsibility.) “Our book,” says Stuckler, “shows that the cost of austerity can be calculated in human lives. It articulates how austerity kills. It shows that austerity in health is always a false economy – no matter how positively some people view it, because for them it shrinks the role of the state, or reduces payments into a system they never use anyway.” When times are hard, governments need to invest more – or, at the very least, cut where it does least harm. It is dangerous and economically damaging to cut vital supports at a time when people need them most. “So there is an opportunity here,” Stuckler concludes, “to make a lasting difference. To set our economies on track for a happier, healthier future, as we did in the postwar period. To get our priorities as a society right. It’s not yet too late. Almost, but not quite.”

by David Stuckler & Sanjay Basu / May 12, 2013

Early last month, a triple suicide was reported in the seaside town of Civitanova Marche, Italy. A married couple, Anna Maria Sopranzi, 68, and Romeo Dionisi, 62, had been struggling to live on her monthly pension of around 500 euros (about $650), and had fallen behind on rent. Because the Italian government’s austerity budget had raised the retirement age, Mr. Dionisi, a former construction worker, became one of Italy’s esodati (exiled ones) — older workers plunged into poverty without a safety net. On April 5, he and his wife left a note on a neighbor’s car asking for forgiveness, then hanged themselves in a storage closet at home. When Ms. Sopranzi’s brother, Giuseppe Sopranzi, 73, heard the news, he drowned himself in the Adriatic. The correlation between unemployment and suicide has been observed since the 19th century. People looking for work are about twice as likely to end their lives as those who have jobs. In the United States, the suicide rate, which had slowly risen since 2000, jumped during and after the 2007-9 recession. In a new book, we estimate that 4,750 “excess” suicides — that is, deaths above what pre-existing trends would predict — occurred from 2007 to 2010. Rates of such suicides were significantly greater in the states that experienced the greatest job losses. Deaths from suicide overtook deaths from car crashes in 2009. If suicides were an unavoidable consequence of economic downturns, this would just be another story about the human toll of the Great Recession. But it isn’t so. Countries that slashed health and social protection budgets, like Greece, Italy and Spain, have seen starkly worse health outcomes than nations like Germany, Iceland and Sweden, which maintained their social safety nets and opted for stimulus over austerity. (Germany preaches the virtues of austerity — for others.) 
As scholars of public health and political economy, we have watched aghast as politicians endlessly debate debts and deficits with little regard for the human costs of their decisions. Over the past decade, we mined huge data sets from across the globe to understand how economic shocks — from the Great Depression to the end of the Soviet Union to the Asian financial crisis to the Great Recession — affect our health. What we’ve found is that people do not inevitably get sick or die because the economy has faltered. Fiscal policy, it turns out, can be a matter of life or death. At one extreme is Greece, which is in the middle of a public health disaster. The national health budget has been cut by 40 percent since 2008, partly to meet deficit-reduction targets set by the so-called troika —  the International Monetary Fund, the European Commission and the European Central Bank — as part of a 2010 austerity package. Some 35,000 doctors, nurses and other health workers have lost their jobs. Hospital admissions have soared after Greeks avoided getting routine and preventive treatment because of long wait times and rising drug costs. Infant mortality rose by 40 percent. New H.I.V. infections more than doubled, a result of rising intravenous drug use — as the budget for needle-exchange programs was cut. After mosquito-spraying programs were slashed in southern Greece, malaria cases were reported in significant numbers for the first time since the early 1970s.

In contrast, Iceland avoided a public health disaster even though it experienced, in 2008, the largest banking crisis in history, relative to the size of its economy. After its three main commercial banks failed, total debt soared, unemployment increased ninefold, and the value of its currency, the krona, collapsed. Iceland became the first European country to seek an I.M.F. bailout since 1976. But instead of bailing out the banks and slashing budgets, as the I.M.F. demanded, Iceland’s politicians took a radical step: they put austerity to a vote. In two referendums, in 2010 and 2011, Icelanders voted overwhelmingly to pay off foreign creditors gradually, rather than all at once through austerity. Iceland’s economy has largely recovered, while Greece’s teeters on collapse. No one lost health care coverage or access to medication, even as the price of imported drugs rose. There was no significant increase in suicide. Last year, the first U.N. World Happiness Report ranked Iceland as one of the world’s happiest nations. Skeptics will point to structural differences between Greece and Iceland. Greece’s membership in the euro zone made currency devaluation impossible, and it had less political room to reject I.M.F. calls for austerity. But the contrast supports our thesis that an economic crisis does not necessarily have to involve a public health crisis.

Somewhere between these extremes is the United States. Initially, the 2009 stimulus package shored up the safety net. But there are warning signs — beyond the higher suicide rate — that health trends are worsening. Prescriptions for antidepressants have soared. Three-quarters of a million people (particularly out-of-work young men) have turned to binge drinking. Over five million Americans lost access to health care in the recession because they lost their jobs (and either could not afford to extend their insurance under the Cobra law or exhausted their eligibility).
Preventive medical visits dropped as people delayed medical care and ended up in emergency rooms. (President Obama’s health care law expands coverage, but only gradually.) The $85 billion “sequester” that began on March 1 will cut nutrition subsidies for approximately 600,000 pregnant women, newborns and infants by year’s end. Public housing budgets will be cut by nearly $2 billion this year, even while 1.4 million homes are in foreclosure. Even the budget of the Centers for Disease Control and Prevention, the nation’s main defense against epidemics like last year’s fungal meningitis outbreak, is being cut, by at least $18 million.

To test our hypothesis that austerity is deadly, we’ve analyzed data from other regions and eras. After the Soviet Union dissolved, in 1991, Russia’s economy collapsed. Poverty soared and life expectancy dropped, particularly among young, working-age men. But this did not occur everywhere in the former Soviet sphere. Russia, Kazakhstan and the Baltic States (Estonia, Latvia and Lithuania) — which adopted economic “shock therapy” programs advocated by economists like Jeffrey D. Sachs and Lawrence H. Summers — experienced the worst rises in suicides, heart attacks and alcohol-related deaths.

Police protect bank from graffiti artists

Countries like Belarus, Poland and Slovenia took a different, gradualist approach, advocated by economists like Joseph E. Stiglitz and the former Soviet leader Mikhail S. Gorbachev. These countries privatized their state-controlled economies in stages and saw much better health outcomes than nearby countries that opted for mass privatizations and layoffs, which caused severe economic and social disruptions.

Like the fall of the Soviet Union, the 1997 Asian financial crisis offers case studies — in effect, a natural experiment — worth examining. Thailand and Indonesia, which submitted to harsh austerity plans imposed by the I.M.F., experienced mass hunger and sharp increases in deaths from infectious disease, while Malaysia, which resisted the I.M.F.’s advice, maintained the health of its citizens. In 2012, the I.M.F. formally apologized for its handling of the crisis, estimating that the damage from its recommendations may have been three times greater than previously assumed.

America’s experience of the Depression is also instructive. During the Depression, mortality rates in the United States fell by about 10 percent. The suicide rate actually soared between 1929, when the stock market crashed, and 1932, when Franklin D. Roosevelt was elected president. But the increase in suicides was more than offset by the “epidemiological transition” — improvements in hygiene that reduced deaths from infectious diseases like tuberculosis, pneumonia and influenza — and by a sharp drop in fatal traffic accidents, as Americans could not afford to drive.

Comparing historical data across states, we estimate that every $100 in New Deal spending per capita was associated with a decline in pneumonia deaths of 18 per 100,000 people; a reduction in infant deaths of 18 per 1,000 live births; and a drop in suicides of 4 per 100,000 people. Our research suggests that investing $1 in public health programs can yield as much as $3 in economic growth.
Public health investment not only saves lives in a recession, but can help spur economic recovery. These findings suggest that three principles should guide responses to economic crises.

First, do no harm: if austerity were tested like a medication in a clinical trial, it would have been stopped long ago, given its deadly side effects. Each nation should establish a nonpartisan, independent Office of Health Responsibility, staffed by epidemiologists and economists, to evaluate the health effects of fiscal and monetary policies.

Second, treat joblessness like the pandemic it is. Unemployment is a leading cause of depression, anxiety, alcoholism and suicidal thinking. Politicians in Finland and Sweden helped prevent depression and suicides during recessions by investing in “active labor-market programs” that targeted the newly unemployed and helped them find jobs quickly, with net economic benefits.

Finally, expand investments in public health when times are bad. The cliché that an ounce of prevention is worth a pound of cure happens to be true. It is far more expensive to control an epidemic than to prevent one. New York City spent $1 billion in the mid-1990s to control an outbreak of drug-resistant tuberculosis. The drug-resistant strain resulted from the city’s failure to ensure that low-income tuberculosis patients completed their regimen of inexpensive generic medications.

One need not be an economic ideologue — we certainly aren’t — to recognize that the price of austerity can be calculated in human lives. We are not exonerating poor policy decisions of the past or calling for universal debt forgiveness. It’s up to policy makers in America and Europe to figure out the right mix of fiscal and monetary policy. What we have found is that austerity — severe, immediate, indiscriminate cuts to social and health spending — is not only self-defeating, but fatal.

The Excel coding error

Who Is Defending Austerity Now?
The Excel error heard ’round the world has deficit-cutters backpedaling
by Matthew O’Brien  /  2013-04-22

Austerians have had their worst week since the last time GDP numbers came out for a country that’s tried austerity. But this time is, well, different. It’s not “just” that southern Europe is stuck in a depression and Britain is stuck in a no-growth trap. It’s that the very intellectual foundations of austerity are unraveling. In other words, economists are finding out that austerity doesn’t work in practice or in theory. What a difference an Excel coding error makes.

Austerity has been a policy in search of a justification ever since it began in 2010. Back then, policymakers decided it was time for policy to go back to “normal” even though the economy hadn’t, because deficits just felt too big. The only thing they needed was a theory telling them why what they were doing made sense. Of course, this wasn’t easy when unemployment was still high, and interest rates couldn’t go any lower. Alberto Alesina and Silvia Ardagna took the first stab at it, arguing that reducing deficits would increase confidence and growth in the short run. But this had the defect of being demonstrably untrue (in addition to being based on a naïve reading of the data). Countries that tried to aggressively cut their deficits amidst their slumps didn’t recover; they fell into even deeper slumps.

Enter Carmen Reinhart and Ken Rogoff. They gave austerity a new raison d’être by shifting the debate from the short run to the long run. Reinhart and Rogoff acknowledged austerity would hurt today, but said it would help tomorrow — if it kept governments from racking up debt of 90 percent of GDP, at which point growth supposedly slows dramatically. Now, this result was never more than just a correlation — slow growth more likely causes high debt than the reverse — but that didn’t stop policymakers from imputing totemic significance to it. That is, it became a “fact” that everybody who mattered knew was true.

Except it wasn’t. Reinhart and Rogoff goofed. They accidentally excluded some data in one case, and used some wrong data in another; the former because of an Excel snafu. If you correct for these very basic errors, their correlation gets even weaker, and the growth tipping point at 90 percent of GDP disappears. In other words, there’s no there there anymore. Austerity is back to being a policy without a justification.

Not only that, but, as Paul Krugman points out, Reinhart and Rogoff’s spreadsheet misadventure has been a kind of the-austerians-have-no-clothes moment. It’s been enough that even some rather unusual suspects have turned against cutting deficits now. For one, Stanford professor John Taylor claims L’affaire Excel is why the G20, the birthplace of the global austerity movement in 2010, was more muted on fiscal targets recently.

The discovery of errors in the Reinhart-Rogoff paper on the growth-debt nexus is already impacting policy. A participant in last Friday’s G20 meetings told me that the error was a factor in the decision to omit specific deficit or debt-to-GDP targets in the G20 communique.

For another, Bill Gross, the manager of the world’s largest bond fund, who, as Joseph Cotterill of FT Alphaville points out, used to be quite the fan of British austerity, made a big about-face in an interview with the Financial Times on Monday:

The UK and almost all of Europe have erred in terms of believing that austerity, fiscal austerity in the short term, is the way to produce real growth. It is not. You’ve got to spend money.

Bond investors want growth much like equity investors, and to the extent that too much austerity leads to recession or stagnation then credit spreads widen out — even if a country can print its own currency and write its own checks. In the long term it is important to be fiscal and austere. It is important to have a relatively average or low rate of debt to GDP. The question in terms of the long term and the short term is how quickly to do it.


Growth vigilantes are the new bond vigilantes. Gross thinks the boom, not the slump, is the time for austerity — which sounds an awful lot like you-know-who. The austerity fever has even broken in Europe. At least a bit. Now, eurocrats can’t say that austerity has been anything other than the best of all economic policies, but they can loosen the fiscal noose. And that’s what they might be doing, by giving countries more time and latitude to hit their deficit targets. Here’s how European Commission president José Manuel Barroso framed the issue on Monday:

While [austerity] is fundamentally right, I think it has reached its limits in many aspects. A policy to be successful not only has to be properly designed. It has to have the minimum of political and social support.

That’s not much, but it’s still much better than the growth-through-austerity plan Eurogroup president Jeroen Dijsselbloem was peddling on … Saturday.

Now, Reinhart and Rogoff’s Excel imbroglio hasn’t exactly set off a new Keynesian moment. Governments aren’t going to suddenly take advantage of zero interest rates to start spending more to put people back to work. Stimulus is still a four-letter word. Indeed, the euro zone, Britain, and, to a lesser extent, the United States, are still focused on reducing deficits above all else. But there’s a greater recognition that trying to cut deficits isn’t enough to cut debt burdens. You need growth too. In other words, people are remembering that there’s a denominator in the debt-to-GDP ratio.

But austerity doesn’t just have a math problem. It has an image problem too. Just a week ago, Reinhart and Rogoff’s work was the one commandment of austerity: Thou shalt not run up debt in excess of 90 percent of GDP. Wisdom didn’t get more conventional. What did this matter? Well, as Keynes famously observed, it’s better for reputation to fail conventionally than to succeed unconventionally. In other words, elites were happy to pursue obviously failed policies as long as they were the right failed policies. But now austerity doesn’t look so conventional. It looks like the punchline of a bad joke about Excel destroying the global economy. Maybe, just maybe, that will be enough to free us from some defunct economics.
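The denominator point lends itself to a toy calculation. The figures below are invented purely for illustration (they are not drawn from any country's accounts), but they show why cutting the numerator can backfire if the cuts also shrink the denominator:

```python
def debt_to_gdp(debt, gdp):
    """Debt burden expressed as a share of one year's output."""
    return debt / gdp

# Invented starting point: debt of 90 against GDP of 100.
start = debt_to_gdp(90, 100)

# Austerity trims the debt slightly, but if the cuts also shrink
# the economy, the ratio can rise even as the deficit falls.
after_austerity = debt_to_gdp(88, 95)

# Growth with unchanged debt lowers the burden instead.
after_growth = debt_to_gdp(90, 105)

print(round(start, 3), round(after_austerity, 3), round(after_growth, 3))
# → 0.9 0.926 0.857
```

Under these made-up numbers, the austerity scenario ends with a *higher* debt-to-GDP ratio than the growth scenario, which is exactly the arithmetic the paragraph above is gesturing at.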





“The “Hard Times”, strictly speaking, referred to the “recession” of 1837-1838, when 90% of the factories in the United States closed following a banking crisis which was blamed on Andrew Jackson. At the heart of this period, these large-cent-sized tokens became necessary substitutes for the government-issued coins, which were to a large extent hoarded. This rich and varied series has achieved a substantial following, with some pieces commanding thousands of dollars. The series includes politically oriented tokens, commercial advertising tokens, and anonymous monetary tokens. Perhaps the most enduring result of this series is the emergence of the donkey as the symbol of the Democratic Party.”


“The event that defines this era was the veto of the renewal of the charter of the Bank of the United States by Andrew Jackson in 1832. The BUS was slated to close in 1836, but Jackson didn’t wait. He withdrew Treasury money from the BUS. (Interestingly, the Treasury had an embarrassment of riches. The US was without debt.) However, when the BUS closed, credit collapsed. “I take the responsibility”, says Andrew Jackson, standing in an empty treasure chest. Martin Van Buren’s ship of state has tattered sails on the obverse of a coin; the reverse shows Henry Clay’s sails billowing. “I follow in the steps of my illustrious predecessor”, says the jackass on the obverse while the reverse shows a treasure chest being borne off by a turtle. “Good for shinplasters” refers to worthless paper money used as stuffing in boots. Some, to avoid charges of counterfeiting, bear the slogan “Millions for defense NOT ONE CENT for tribute.”

These tokens were about the size of a US Large Cent, just under 3 cm across, hefting over 10 grams. They were an East Coast phenomenon, since metals, dies, etc., were found near industry. (Twenty-five years later, Civil War tokens were issued from Michigan, Indiana, etc.) The fact that they are found today in middle grades around Fine indicates that they actually circulated in trade. America eventually recovered from the Panic of 1837. The debt rose. Finances moved from Chestnut Street in Philadelphia to Wall Street in New York. Hard times tokens retired to dressers and chests as government cents (soon smaller) circulated again.”

Hard Times Token, 1834


“One of the more interesting aspects of American numismatics is the study of those tokens which served in place of coins. The best known of these were made in the late 1830s and today are called Hard Times Tokens because of the economic problems that affected the United States during that era. Prior to 1837 tokens were little used in the American marketplace but a series of events that began in 1834 was to change everything. In that year, after many years of debate, Congress finally reformed the gold coinage by lowering the weights. During the 1820s most coined gold had left the United States, leaving only silver and bank notes to conduct commercial affairs.

The act of June 1834 was meant to bring United States gold coins into line with the international ratio between gold and silver. The law of 1792 had set the ratio at 15 to 1 (i.e. one ounce of gold was worth 15 ounces of silver) but by the 1820s the world markets used ratios closer to 16 to 1. The result of the 1834 law was that gold flowed heavily into the United States because the ratio had been set a little too high, at 16 to 1. During 1835 and 1836 Mint and Treasury officials became concerned that the influx of gold was having the unwanted effect of driving out the silver coinage of the United States; foreign silver still arrived in considerable quantities, however. To solve this latest problem, Mint Director Robert M. Patterson prepared a comprehensive coinage bill that included a provision that slightly lowered the ratio, to about 15.9 to 1. The revised law was passed in January 1837 and proved beneficial. U.S. silver stopped leaving the country while gold continued to arrive.
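The arbitrage logic behind those ratio adjustments can be sketched in a few lines. The helper name `flow_direction` and the world-market ratio of roughly 15.7 to 1 are assumptions made for this illustration, not figures from the quoted article:

```python
def flow_direction(mint_ratio, world_ratio):
    """Which metal moves under a bimetallic standard.

    Both ratios are ounces of silver valued equal to one ounce of gold.
    If the mint values gold more highly than the world market does,
    arbitrageurs bring gold in and carry silver out, and vice versa.
    """
    if mint_ratio > world_ratio:
        return "gold flows in, silver flows out"
    elif mint_ratio < world_ratio:
        return "gold flows out, silver flows in"
    return "no arbitrage"

WORLD = 15.7  # assumed world-market ratio for the sketch

# 1792 law: 15 to 1, below the world ratio, so gold left the country.
print(flow_direction(15.0, WORLD))
# 1834 law: 16 to 1, above the world ratio, so gold poured in
# and silver coin began to disappear.
print(flow_direction(16.0, WORLD))
# 1837 revision: about 15.9 to 1; still slightly gold-favoring,
# but close enough to the market that silver stopped draining away.
print(flow_direction(15.9, WORLD))
```

The point of the sketch is only the direction of the inequality: each statutory ratio change flipped or narrowed the gap against the world price, which is why gold kept arriving after 1837 while U.S. silver stayed home.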

During early 1837 the United States was perhaps better supplied with gold and silver coins than it had ever been in its history up to that time. But all of this would soon end, due to a series of blunders made by the states, as well as the federal government. The early 1830s witnessed a great expansion of business and with this came a call for roads and canals so that goods could be gotten to market and raw materials brought from the interior to the coastal manufacturing plants. All of this initiated massive borrowing by the states for these internal improvements. This spending created inflation and increased issues of paper money. The expansion of the roads and canals played out against another backdrop, the attack by President Andrew Jackson on the Bank of the United States. This bank, which had been chartered in 1816 for 20 years, served the nation well in forcing private banks to honor their paper currency with specie, usually silver but after 1834 in gold if desired.

The strong position of the Bank of the United States, however, inevitably led to political involvement and the bank leadership was openly against the Jackson Administration. This President felt the same about the bank and was determined to destroy it. The early 1830s saw a bitter struggle between the bank and Jackson. The bank lost. One of the strategies used by the President to undermine the bank was the removal of federal deposits (gold and silver coin). Such funds were placed in private banks friendly to the administration, called “pet banks” by Jackson’s enemies. These banks were sometimes poorly managed and the influx of hard money led them to issue loans to politically connected individuals without the proper collateral.

Hard Times Token, 1834

The federal government had also stepped in to make matters worse, much worse. Jackson had long felt that paper money, in particular that issued by private banks, was holding back the economic expansion of the United States; the President believed that bank notes of less than $20 in value ought not to be issued. The problem with this was that there was a large number of notes of less than $5 in value in daily use, an unintended result of the monies going to pet banks. The disaster waiting to happen was politely termed the Specie Circular and had been issued by Treasury Secretary Levi Woodbury on July 11, 1836. It required that land purchases on the frontier be made strictly in gold or silver coin. Some exceptions were made for the use of paper money on a temporary basis but the intent was clearly to force paper money out of daily use.

At the same time, the massive influx of gold into the United States from 1834 through 1836 caused problems in Europe, especially England. The Bank of England responded to the loss of gold by raising the discount rate to 5 percent in September 1836. This caused a reverse flow of gold to Great Britain, although on a limited basis at first. By the spring of 1837 gold was leaving for England at a growing rate.

The cumulative effect of the Specie Circular, funds to pet banks, and the English discount rate came crashing down in May 1837. On May 10 the New York banks suspended specie payments for their notes, triggering a run on banks throughout the United States. The financial upheaval forced many businesses to fail and a large number of workmen were laid off. The Panic of 1837, as it came to be known, was a severe recession but not a depression. Gold and silver were now rarely used in commerce, their place being taken by bank notes as well as scrip for values as low as a few cents. The government had meant well but failed to foresee what would happen by acting too quickly.

As in all such situations a number of people saw the opportunity not only to make money, but to score political points against their enemies at the same time. It is hard to say which aim was the most important. The token coinage which resulted succeeded very well in both aims, much to the irritation of the supporters of Andrew Jackson and his hand-picked successor, Martin Van Buren. Van Buren had taken the oath of office as President on March 4, 1837, just in time to reap the whirlwind caused by the earlier mistakes.

On the eve of the token explosion in 1837 the United States Mint had no idea of what would happen. But it did have a vested interest in seeing to it that the tokens were neither issued nor used in the marketplace. The reason was purely economic in that the Mint derived a considerable profit from issuing copper coins to the public. There was, however, a difficult problem that the Mint had in dealing with the token outbreak. Copper coins were not legal tender and not convertible into gold or silver except at the so-called exchanges, where copper cents could be converted to silver for a fee of several percent. Merchants had to pay their bills in specie (until the banks suspended specie payments) so the accumulation of United States copper coins was not exactly a blessing. (Legal tender status was not given to minor coins until 1864.)

Just when the first Hard Times Tokens began to be seen in the marketplace is uncertain, but distribution of these pieces was well under way by the summer of 1837, perhaps as early as mid-July. They apparently first appeared in New York City but this is also not quite certain and is based on the fact that more varieties of tokens are known for this area.

Whatever the exact sequence of events, they were unknown to Mint Director Robert M. Patterson until the fall of 1837. He noticed, in a local newspaper, an advertisement offering tokens for sale at a price well under the official value of a cent. Considering that the Mint needed the profit on copper coinage to offset other expenses, he was less than pleased at what he saw. Dr. Patterson sent a Mint employee to purchase a few of the tokens that had been advertised and then visited the United States district attorney, whose name was Reed. Patterson told Mr. Reed that the tokens in question were “spurious” and that the 1825 anti-counterfeiting statute was applicable in this case. Patterson testified before a federal grand jury and that body agreed with him; federal officials now ordered the local merchant to stop selling tokens on pain of prosecution.

At first the Mint director believed that the token episode was an isolated one. However, he soon learned that he had witnessed but a small part of the business and that it was widespread throughout New England and New York State. Patterson then began writing letters to friends asking them to investigate the matter and report back to him. By late November Patterson had learned how much of a nuisance the tokens had become, at least in his mind. On Dec. 2, 1837, he wrote Treasury Secretary Woodbury on what he viewed as a worsening situation as the Mint’s profits on copper coinage were being eroded.

Patterson began his letter by recounting the incident with the Philadelphia merchant and the grand jury. Patterson noted that similar problems were encountered at Baltimore but that the major problem was in New York City where the tokens were not only manufactured but used widely in ordinary business transactions. One friend of the director’s in New York had picked up 10 different kinds of tokens and sent them to the Mint for examination. 
The Mint director found that at least three of the tokens had been made at the same private mint because the design was similar. In particular Patterson mentioned the following tokens (or “store cards” as we might term them now): New York Joint Stock Exchange Company, Robinson, Jones & Company, and Ezra Sweet. He went on to note that a newspaper, the New York Observer, was reporting numerous kinds of such pieces in daily use throughout the city. According to the newspaper account, the tokens were sold for about 62 cents per hundred pieces, a nice profit when passed on for a cent.

“In its issue of Nov. 23, 1837, the Emancipator ran an advertisement offering the Female Slave tokens at $1 per hundred. Made of good copper and with a device on reverse similar to legal U.S. cents, they sold well. The ad also said that it was proposed to issue Kneeling Male Slave tokens as well, and this accounts for the few pattern pieces of HT 82, which were never produced for circulation.”

According to Patterson, an anti-slavery newspaper, the Emancipator, reported that pieces similar to a cent of a “new emission” were being sold at the offices of the Anti-Slavery League on Nassau Street. The paper described the devices as being anti-slavery in nature. There is one anti-slavery token listed by Lyman Low (No. 54), in his study of Hard Times Tokens, which seems to fit the given conditions except that it is dated 1838. Perhaps the issuers felt that it would be coming out so late in 1837 that it ought to be given the next year’s date. The listing made by Dr. Patterson shows another interesting aspect of the Hard Times Tokens in general. The date, if prior to 1837, may well mean nothing more than some important year connected with the business that issued them. The Robinson, Jones, & Company piece, for example, uses an 1833 date to show that it received a medal that year for a button display.

Patterson also noted that tokens were well used in Boston though he did not give any names. The Boston tokens, as with most of the others, were lightweight compared to the genuine cent, averaging perhaps 70 percent of the weight. He thought that manufacturing costs were about 50 cents, or a bit more, for a hundred pieces, which gave a decent profit when they were later sold at about 62 cents per hundred. The dies were crude and cheaply made, which helped hold costs down. Not only did the merchants get “cents” at a strong discount but most of these tokens had the added advantage of advertising their businesses. As far as they were concerned it was a win-win situation. Dr. Patterson, however, had a slightly different opinion.
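The economics described above are easy to check. Using the figures quoted in the passage (roughly 50 cents to strike a hundred tokens, a wholesale price of about 62 cents per hundred, and each token passing in trade at one cent), the margins work out as follows; the function name is invented for the sketch:

```python
def token_margins(cost_per_100=0.50, wholesale_per_100=0.62, face_each=0.01):
    """Profit per hundred tokens, in dollars, for maker and merchant."""
    maker_profit = wholesale_per_100 - cost_per_100        # die-sinker's cut
    merchant_profit = face_each * 100 - wholesale_per_100  # merchant's cut
    return maker_profit, merchant_profit

maker, merchant = token_margins()
print(f"maker: ${maker:.2f} per hundred")      # → maker: $0.12 per hundred
print(f"merchant: ${merchant:.2f} per hundred")  # → merchant: $0.38 per hundred
```

So the manufacturer cleared about 12 cents per hundred while the merchant who passed them at face value pocketed roughly 38 cents per hundred, on top of the free advertising, which is why the passage calls it a win-win for everyone except the Mint.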

In the meantime Treasury Secretary Woodbury had taken Patterson’s letter under consideration. On Dec. 4 he replied, noting that he had just written the federal attorneys at New York and Baltimore; he did not mention Boston but this was probably done as well. The attorneys were instructed to take whatever steps were necessary to eradicate the problem. December 6 saw Patterson writing Woodbury again, this time to report that he had seen another 11 tokens, primarily from New York. His list included token issuers Henry Anderson, H. Crossman, Maycock & Company, Merchants Exchange, and Abraham Riker. These later tokens were somewhat heavier, though still as much as 32 grains below the legal standard of 168 grains.

In an 1849 letter discussing these tokens Dr. Patterson mentioned that the legal attacks by federal attorneys had put a stop to the business. It is not clear from the letter, however, if the political tokens were interdicted by the same methods since no names appeared on these as issuers. It is believed that very few merchant tokens were struck after the spring of 1838. At the same time as the merchant pieces were issued, political opportunists saw the chance not only to attack Presidents Jackson and Van Buren but to make a tidy profit in the process. Quite a few varieties of the political tokens were issued and are collected today by specialists.

It is of interest to note that the tokens of 1837-1838 are known as Hard Times Tokens, but this is a little less than accurate. The recession that started in May 1837 was essentially over within a year; New York banks resumed specie payments in May 1838. In June 1839, however, matters suddenly got worse, and this time it was a full-blown depression with large numbers thrown out of work. The underlying cause of this second round of economic bad news was primarily the English discount rate, as too much gold had again left the island kingdom. This time the problem lasted until 1842, when important discoveries of gold in Russian Siberia provided massive quantities of the yellow metal for world markets. Hard Times Tokens are but a footnote in the numismatic history of the United States, yet they played a key role in the marketplace for a few months. They deserve to be better known.”


“Hard Times tokens represent an unusual period in the financial history of the United States. President Jackson, in his campaign of 1832, was vehemently opposed to the Second Bank of the United States. This central bank in Philadelphia was said by opponents to control the money supply in favor of the wealthy merchants. Populist Jackson vowed to abolish it. The bank issued its own currency, which quickly became the most stable paper money in the land. It exercised considerable control over credit and interest rates throughout the country. When Jackson was reelected, he tried to abolish the bank, and when its federal charter expired in 1836 he succeeded in accomplishing his goal. In the meanwhile, the president of the bank, Nicholas Biddle, tightened the money supply, which then led to a financial panic. Other banks issued paper money with little or no gold or silver backing and quickly folded. By 1837 over 100 banks had gone under. The small change necessary for commerce began to disappear. Tokens were issued to solve the needs of the public. They were frequently political or satirical in nature. The tokens of the period 1832-1844 are classified as the Hard Times issues.”


Mint Drop Token, 1837

“‘Bentonian Currency’ was hard money as opposed to paper. The crash of 1837 and the Hard Times which followed were by no means solely due, as the Whig leaders would have it believed, to the overthrow of their policy and the “mint drops” or hard money of Jackson and Van Buren: they were only the culmination of evils which had long been threatening disaster. The Panic of 1837 resulted in the hoarding of the coins in circulation. The withdrawal of public funds from the banks led to a contraction of the currency and great changes in apparent values, which were the apparent causes of “Hard Times.” To fill the need for small change in circulation, a wide variety of copper tokens appeared in 1837.”

Illustrious Predecessor Token, 1837


“Because Van Buren was a supporter of Jackson — going so far as to state his intent to follow in Jackson’s footprints during his inauguration — Van Buren was a solid target for people’s resentment due to the failing economy. The Hard Times tokens were minted in cheap copper and bronze blends by private businesses and individuals, and enthusiastically decorated with political satire of all kinds. Van Buren’s face didn’t adorn many (if any) of these tokens, although caricatures of Jackson were quite common. Mostly, Van Buren was mentioned as Bad Things To Come, represented by images such as the ship “Experiment” breaking up in stormy seas, a symbol of the attempt to do without banks despite the lack of evidence that such an experiment could work. Van Buren’s inauguration statement, “I follow in the footsteps of my illustrious predecessor,” stuck with him — but was combined with a picture of a jackass to show just what his opponents thought of him. That donkey, originally used as a visual ersatz Andrew Jackson, eventually became the way the public saw the Democratic Party, and survives in today’s Democratic Party logo. These Hard Times Tokens were some of the first lasting representations of the Democrats as a donkey.

These tokens weren’t exactly currency, although some businesses accepted them in lieu of actual monies, seeing that, due to the bank’s actions and Jackson’s opposition to federal currency, these Hard Times tokens had just about as much monetary value as the so-called ‘real thing’. Mostly, they were passed around like political buttons today, demonstrating political affiliation and making a statement against the government at the time.”

hard times

In 1834, an economic downturn on the English stock market brought “hard times” to both Canada and the United States. However, the event that defines the start of this era in the USA was a clash between the Bank of the United States and President Andrew Jackson in 1832.

The BUS was a semi-private institution, the invention of Alexander Hamilton, and a precursor to the Federal Reserve. Its charter was slated for renewal in 1836, but Jackson didn’t wait: he withdrew US Treasury money from the BUS and deposited it in local banks. Interestingly, the Treasury had an embarrassment of riches, about $17 million in surplus gold and silver, and the US government was without debt. However, when the BUS closed, credit collapsed. Political activists and merchants created these 1-cent tokens to take up the slack. They were an East Coast phenomenon, since the metals, dies, and other necessities required industry. (Twenty-five years later, Civil War tokens were issued from Michigan, Illinois, and Wisconsin in the West.) The fact that most types of Hard Times tokens can be found today in grades from Fine down to Good indicates that they actually circulated in trade.

The standard reference manual for this series is Hard Times Tokens 1832–1844 by Russell Rulau, based on an 1899 book by Lyman H. Low. Rulau includes the Low numbers in his catalog, estimates retail prices, and has added many new items over the years with each new edition. The book also approximates rarity, from R1 (common) to R8 (perhaps unique). Some of these coins are objectively rare and highly valued outside the world of numismatics. “Am I not a Woman” is the motto on an Abolitionist token; “Am I not a Man” is its companion piece. Collectors have bid these up to about $80 in better grades and perhaps over $10,000 in uncirculated condition. These two are difficult to find in low grade because they have been popular with collectors for over 150 years.

You can find common Hard Times tokens in almost any dealer’s inventory, priced all over the range depending on the dealer’s willingness to own them. You will have to use basic numismatic principles to grade them; although they rate a general entry in The Red Book, not all grading services will slab them. Common pieces run no more than $5 in low grade, or about $15 just below uncirculated. America eventually recovered from the Panic of 1837. The federal debt rose. Finance moved from Chestnut Street in Philadelphia to Wall Street in New York. Hard Times tokens retired to dressers and chests as government cents (soon smaller) circulated again. If you really love American history and treasure the values that define our nation, you will find a wealth of pride in these artifacts.

“Pomme de Terre, Pomme en l’Air.” Coins by Matthew Hincman

Coins For Hard Times: Artist Makes His Own Money
by David Kestenbaum /  October 05, 2009

I ran into artist Matthew Hincman last week; he has decided that things have gotten bad enough that it’s time to create your own money. Hincman designed the coin above and had 1,200 minted in copper, which he plans to leave on the ground at random locations for people to pick up and puzzle over. He says he modeled the coins after the Hard Times tokens that circulated in the 1800s, many of them satirical. Hincman has no plans to control the money supply at large. In fact, he’s trying to stay out of trouble. For one recent project, he installed an unusual version of the standard park bench; it was impounded by the authorities, though they liked it so much that it’s now back in place. Hincman figures there’s no law against leaving coins around. He says he sometimes drops to one knee and pretends to be tying his shoe, then casually deposits one on the sidewalk.


Glow-in-the-Dark to prevent ‘counterfeiting’

Open Call for Entries: “The ‘producers’ of the International Drink Ticket herein announce a design contest for proposals to replace the current ‘Spanglish’ face of the Ticket, not pictured. The winning designer will get a small share (percentage) of any future known-universe profit. The winning design will be used to create the mold that embosses one side of the IDT (the ‘Chinglish’ side will remain the same). The ‘Spanglish’ side may include a picture, as on a coin, such as Obama or a bird, etc., but at minimum it must include impressions (in English or Spanish) that read “Int’l Drink Ticket” + “Brooklyn Mint” + the current year: “2009.””

“The International Drink Ticket, printed in editions, is a currency alternative sincerely offered to replace the collapsed dollar (should the US dollar irrevocably fail). All over the world, even if one abstains, everyone knows someone who drinks: one International Drink Ticket is worth one drink, that is to say, one 4-count pour (using a pour spigot) of bottom-shelf liquor (non-well), or a bottled beer. Everything else is negotiable. The IDT is also easily used as a barter coin. Common bartender uses are two cans of beer for one IDT, or sometimes two 2-count shots. One IDT is currently worth around US$5 in NYC, but this value fluctuates regionally. Design entries should be big enough to 3-D print, and fully detailed.” [Please post proposal editions below as comments.]

by Pamela G. Parker
design by Pamela G. Parker