http://givingpledge.org/
http://onlythesuperrich.org/
EARLIEST-KNOWN RICH PEOPLE
http://dwardmac.pitzer.edu/Anarchist_Archives/kropotkin/mutaidch3.html
by Peter Kropotkin / 1902
“Eskimo life is based upon communism. What is obtained by hunting and fishing belongs to the clan. But in several tribes, especially in the West, under the influence of the Danes, private property penetrates into their institutions. However, they have an original means for obviating the inconveniences arising from a personal accumulation of wealth which would soon destroy their tribal unity. When a man has grown rich, he convokes the folk of his clan to a great festival, and, after much eating, distributes among them all his fortune. On the Yukon river, Dall saw an Aleonte family distributing in this way ten guns, ten full fur dresses, 200 strings of beads, numerous blankets, ten wolf furs, 200 beavers, and 500 zibelines. After that they took off their festival dresses, gave them away, and, putting on old ragged furs, addressed a few words to their kinsfolk, saying that though they are now poorer than any one of them, they have won their friendship. Like distributions of wealth appear to be a regular habit with the Eskimos, and to take place at a certain season, after an exhibition of all that has been obtained during the year.(30) In my opinion these distributions reveal a very old institution, contemporaneous with the first apparition of personal wealth; they must have been a means for re-establishing equality among the members of the clan, after it had been disturbed by the enrichment of the few. The periodical redistribution of land and the periodical abandonment of all debts which took place in historical times with so many different races (Semites, Aryans, etc.), must have been a survival of that old custom. And the habit of either burying with the dead, or destroying upon his grave, all that belonged to him personally — a habit which we find among all primitive races — must have had the same origin. In fact, while everything that belongs personally to the dead is burnt or broken upon his grave, nothing is destroyed of what belonged to him in common with the tribe, such as boats, or the communal implements of fishing. The destruction bears upon personal property alone. At a later epoch this habit becomes a religious ceremony. It receives a mystical interpretation, and is imposed by religion, when public opinion alone proves incapable of enforcing its general observance. And, finally, it is substituted by either burning simple models of the dead man’s property (as in China), or by simply carrying his property to the grave and taking it back to his house after the burial ceremony is over — a habit which still prevails with the Europeans as regards swords, crosses, and other marks of public distinction.(31)”
30. Dall saw it in Alaska, Jacobsen at Ignitok in the vicinity of the Bering Strait. Gilbert Sproat mentions it among the Vancouver indians; and Dr. Rink, who describes the periodical exhibitions just mentioned, adds: “The principal use of the accumulation of personal wealth is for periodically distributing it.” He also mentions (loc. cit. p. 31) “the destruction of property for the same purpose” (of maintaining equality).
31. In a remarkable work, The Religious System of China, published in 1892-97 by J. M. de Groot at Leyden, we find the confirmation of this idea. There was in China (as elsewhere) a time when all personal belongings of a dead person were destroyed on his tomb — his mobiliary goods, his chattels, his slaves, and even friends and vassals, and of course his widow. It required a strong reaction against this custom on behalf of the moralists to put an end to it. With the gipsies in England the custom of destroying all chattels on the grave has survived up to the present day. All the personal property of the gipsy queen who died a few years ago was destroyed on her grave. Several newspapers mentioned it at that time.
AND NOW
http://www.guardian.co.uk/world/2010/jul/12/bill-and-melinda-gates-foundation
Inside the Bill and Melinda Gates Foundation
by Andy Beckett / 12 July 2010
The headquarters of probably the most powerful charity in the world, and one of the most quietly influential international organisations of any sort, currently stand between a derelict restaurant and a row of elderly car repair businesses. Gentrification has yet to fully colonise this section of the Seattle waterfront, and even the actual premises of the Bill & Melinda Gates Foundation, which, appropriately perhaps, used to be a cheque-processing plant, retain a certain workaday drabness. Only four storeys high, with long rows of windows but no hint of corporate gloss, its beige and grey box sits anonymously in the drizzly northern Pacific light.
There is no sign outside the building. There is not even an entrance from the street. Instead, visitors must take a side road, stop at a separate gatehouse, also unmarked, and introduce themselves to a security guard, of the eerily polite and low-key kind employed by ex-heads of state and the extremely rich. Once admitted, you cross a car park full of modest vehicles and, if you are lucky, glimpse one of the world-renowned health or poverty specialists working for the foundation, dressed in the confidently casual Seattle office uniform of chinos and rainproofs. Then you reach the reception: finally, there is a small foundation logo on the wall, and beside it a few lyrical photographs of children and farmers in much dustier and less prosperous places than Seattle. Only past the reception, almost hidden away on a landing, is there a reminder of the foundation’s status and contacts: a vivid shirt in a glass case, presented during a visit by Nelson Mandela.
The low-lit corridors beyond have little of the scruffiness and bustle you often find at charities. The fast-expanding foundation staff (presently around 850 employees) are in increasing demand around the world: meeting governments, attending summits and conferences, and above all “in the field”, as foundation people put it, checking on the progress of the hundreds of projects – from drought-tolerant seeds to malaria vaccines to telephone banking for the developing world – to which the organisation has given grants since it was founded in 1994.
In Seattle, maps of Africa and southern Asia, the foundation’s main areas of activity outside America, are pinned up in the often empty, sparsely decorated offices and cubicles. There are also cuttings about the foundation’s work from the Economist and the Wall Street Journal, not publications you might have previously associated with a big interest in global disease and poverty. And lying on the foundation’s standard-issue, utilitarian desks, there are its confidently written and comprehensively illustrated reports: Ghana: An Overall Success Story is the title of one left in the unoccupied office I have been lent between interviews. The foundation, in short, feels like a combination of a leftish thinktank, an elite management consultancy and a hastily expanding internet start-up. Is it the sort of institution that can really help the world’s poorest people?
For 14 of the last 16 years Bill Gates has been the richest person on earth. More than a decade ago, he decided to start handing over the “large majority” of his wealth – currently £36bn – for the foundation to distribute, so that “the people with the most urgent needs and the fewest champions” in the world, as he and his wife Melinda put it on the foundation website, “grow up healthier, get a better education, and gain the power to lift themselves out of poverty”. In 2006, Warren Buffett, currently the third richest person in the world, announced that he too would give a large proportion of his assets to the foundation. Its latest accounts show an endowment of £24bn, making it the world’s largest private foundation. It is committed to spending the entire endowment within 50 years of Bill and Melinda Gates’s deaths. Last year it awarded grants totalling £2bn.
As well as its money, it is the organisation’s optimism and the fame of its main funder – in 2008 Bill Gates stopped working full-time for his computer giant Microsoft to concentrate on the foundation – that has given it momentum. Last May an editorial in the revered medical journal the Lancet praised it for giving “a massive boost to global health funding . . . The Foundation has challenged the world to think big and to be more ambitious about what can be done to save lives in low-income settings. The Foundation has added renewed dynamism, credibility, and attractiveness to global health [as a cause].”
Precise effects of big charity projects can be hard to measure, especially over a relatively short period. But already two bodies that the foundation funds heavily, the Global Alliance for Vaccines and Immunisation (Gavi) and the Global Fund to Fight HIV/Aids, Tuberculosis and Malaria, have, according to the foundation, delivered vaccines to more than 250 million children in poor countries and prevented more than an estimated five million deaths. “The foundation has brought a new vigour,” says Michael Edwards, a veteran charity commentator and usually a critic of billionaire philanthropists. “The charity sector can almost disempower itself; be too gloomy about things . . . Gates offers more of a positive story. He is a role model for other philanthropists, and he is the biggest.”
“Everyone follows the Gates foundation’s lead,” says someone at a longer-established charity who prefers not to be named. “It feels like they’re everywhere. Every conference I go to, they’re there. Every study that comes out, they’re part of. They have the ear of any [national] leadership they want to speak to. Politicians attach themselves to Gates to get PR. Everyone loves to have a meeting with Gates. No institution would refuse.”
The foundation has branch offices in Washington DC, Delhi and Beijing. This year, it opened an office in London, not in one of the scruffy inner suburbs usually inhabited by charities, but close to the Houses of Parliament. Seth Berkley, head of the International Aids Vaccine Initiative [IAVI], says: “The foundation has the advantage of speed and flexibility. When they want to, they can move quickly, unlike many other large bureaucracies. Most of the other private foundations in the US don’t work globally. Others are more staid than Gates. I used to work at the Rockefeller Foundation [an older American charity] and dole out grants in small amounts. The Gates foundation gave us at IAVI a grant of $1.5m (£1m), then $25m. Then they gave us a line of credit – which is extremely unusual in grant-making – of $100m, to give us assets to be able to negotiate with pharmaceutical companies and initiate vaccine development programmes. Using that $100m, we were able to leverage lots more funding – $800m in total. What Gates allowed us to do was go out and search for new ideas and move quickly on them. The old way was to find the new ideas, and then look for a donor to back them.”
Besides its dizzying grants, the foundation is also becoming a magnet for talented staff and collaborators. “We probably get more than our fair share of great external expertise and insight,” says chief executive Jeff Raikes. Foundation staff can have a certain self-assurance. When the history of global health is written, says Katherine Kreiss, the foundation officer overseeing its nutrition projects, “the start of the Gates foundation will absolutely be a seminal moment.”
Some have reservations about this power and the use made of it. Mindful of the foundation’s ubiquity, few in the charity world are prepared to criticise it on the record. But last May the Lancet published two authored articles on the foundation. “Grant-making by the Gates foundation,” concluded one, “seems to be largely managed through an informal system of personal networks and relationships rather than by a more transparent process based on independent and technical peer review.” The other article found that, “The research funding of the Foundation is heavily weighted towards the development of new vaccines and drugs, much of it high risk and even if successful likely to take at least the 20 years which Gates has targeted for halving child mortality.”
In a forthcoming article for the Journal of Law, Medicine & Ethics, Devi Sridhar, a global health specialist at Oxford University, describes as a “particularly serious problem” the “loss of health workers from the public sector to better funded NGOs offering better remuneration”. She also suggests that the foundation, like other health organisations based in rich countries but active in the poor southern hemisphere, “[has] tended to fund . . . a large and costly global health bureaucracy and technocracy based in the north”. The foundation responds, “Much of our grant-making goes to large intermediary partners that in turn provide funding and support to those doing the work in the field, often to developing country institutions. We’re not able to provide a simple funding breakdown.”
The rise of the foundation has been part of a larger revival of interest in the west in the problems of poor countries. This phenomenon has encompassed increased government aid budgets, initiatives by the World Health Organisation and World Bank, celebrity-led events and campaigns such as Live8, image-conscious corporate schemes, and countless private ventures, from the sober and long-term to the reactive, adventure-seeking, self-styled “extreme humanitarianism” currently being practised in Haiti by freelance American volunteers and breathlessly described in the July issue of Vanity Fair.
It is hard to see this explosion of activity as a wholly bad thing. But it does have political implications. “It’s kind of [creating] a post-UN world,” says someone close to the Gates foundation. “People have gotten interested in fast results.” The UN, he says, is too slow and bureaucratic – you could say democratic – to achieve them. Critics of the new, more entrepreneurial aid industry such as the Dutch journalist Linda Polman, in her recent book War Games: The Story of Aid and War in Modern Times, see empire-building and wasteful competition as well as worthwhile altruism. “Everyone in global health is talking about poor coordination,” says someone at a charity in that field. “The Gates foundation is contributing to the fragmentation and duplication.”
And finally, a suspicion lingers, slowly fading but still there, that the foundation’s activities are some sort of penance for Gates’s world-dominating behaviour at Microsoft – or a continuation of that world domination by other means. Both Raikes and his predecessor as foundation chief executive were at Microsoft; Raikes from 1981 to 2008, during which time he was the company’s key figure after Gates and his co-founder Paul Allen.
As Raikes sits in his office, a little messy-haired, dressed in a zip-up jumper, fidgety in his chair and brisk in his answers, he even seems a bit like Gates. “There are some real cultural differences between the Gates foundation and Microsoft,” he says. “Some of that’s good and some of that’s not so good. The foundation is in a stage of . . . maturation. In philanthropy there is kind of a culture of, [he puts on a slightly airy-fairy voice] ‘You and I are here to help the world, and so we can’t disagree.’ At Microsoft, people would really throw themselves into the fray. I’m trying to encourage that here.” He offers only limited reassurance to those who consider the foundation too powerful. “We’re not replacing the UN,” he says. “But some people would say we’re a new form of multilateral organisation.” Is the foundation too ubiquitous? He smiles: “There are many people who want us to be much more involved than we want.”
Since Raikes joined the foundation in 2008, it has nevertheless broadened its activities. “Today we focus in on 25 key areas,” he says, then starts ticking them off on his strong fingers: “Eradicate polio. Reduce the burden of HIV, tuberculosis, diarrhoea, pneumonia . . .” Why not concentrate on fewer of these huge tasks? “You might say it’s a little bit of a business way of doing things. There are limits to the amount of money we could invest in any given area.” The foundation has so much money, it worries about saturating particular areas of need with grants and so achieving diminishing returns. Instead, says Raikes, “We think dollars-per-Daly in a big way.”
Daly is an acronym for disability-adjusted life year, an increasingly common term in international aid and global health circles, which measures the number of years of healthy life lost to either severe illness or disability or premature death in a given population. The idea that suffering and its alleviation can be measured with some precision is characteristic of the foundation’s technocratic, optimistic thinking.
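To make the metric concrete, here is a minimal sketch of a DALY calculation in Python, following the standard WHO decomposition (years of life lost to premature death plus years lived with disability). Every input number in the example is invented for illustration; nothing below is foundation data.

```python
# Minimal sketch of the DALY (disability-adjusted life year) metric:
#   DALYs = YLL + YLD, where
#   YLL = deaths * standard life expectancy remaining at age of death
#   YLD = cases * disability weight * average duration of the condition
# All input figures below are hypothetical, chosen only to illustrate.

def dalys(deaths, life_exp_remaining, cases, disability_weight, duration_years):
    yll = deaths * life_exp_remaining                  # years of life lost
    yld = cases * disability_weight * duration_years   # years lived with disability
    return yll + yld

def dollars_per_daly(cost_dollars, dalys_averted):
    """Cost-effectiveness ratio: the 'dollars-per-Daly' Raikes describes."""
    return cost_dollars / dalys_averted

# A hypothetical $10m vaccination campaign: averts 1,000 child deaths
# (about 60 years of remaining life expectancy each) plus 50,000
# non-fatal cases (disability weight 0.2, lasting half a year each).
averted = dalys(1_000, 60, 50_000, 0.2, 0.5)           # 65,000 DALYs averted
print(f"DALYs averted: {averted:,.0f}")
print(f"Dollars per DALY: {dollars_per_daly(10_000_000, averted):,.2f}")
```

Dividing a grant’s cost by the DALYs it averts gives a single ratio that can be compared across candidate projects, which is what thinking “dollars-per-Daly in a big way” amounts to in practice.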
The charity also seeks to maximise its impact through partnerships. It does not, for example, conduct medical research or distribute vaccines itself; instead it gives grants to those it considers the best specialists: “We think of ourselves as catalytic philanthropists,” says Raikes. He confirms the widely held view – sometimes meant as a criticism – that the health solutions the foundation favours are usually technical: “The foundation is really oriented towards the science and technology way of thinking. We’re not really the organisation that’s involved in bed-nets for malaria. We’re much more involved in finding a vaccine.”
The foundation’s health strategy is undergoing an internal review led by Girindre Beeharry, a pin-sharp youngish man from Mauritius who studied economics at the Sorbonne and Oxford. “The iconic story we all tell about the foundation,” he says, “is, if you had been in polio [medicine] 50 years ago, and your only instrument [to combat it] had been the iron lung, the orthodox approach would have been, ‘How can we distribute iron lungs to Africa?’ Or you could have spent some of your dollars on developing a polio vaccine – which is how we think.”
Ignacio Mas, another fast-talking foundation man who did his economics at Harvard and specialises in financial services for the poor, adds: “If you have this mindset of finding big solutions for big problems, that means technology, in practical terms. Because that is really the only thing that can transform.”
Aware of how much they have already changed the world through their businesses, computer tycoons can turn into impatient broader reformers. Google co-founder Sergey Brin is currently funding an attempt to revolutionise the search for a cure for Parkinson’s disease. Gates himself grew up in a charity-conscious environment: his mother Mary, a teacher, and his father William H Gates Sr, a well-connected Seattle lawyer, were both active in United Way, the international community service organisation. The Gates family were prosperous, and lived in the hushed and idyllic suburb of Laurelhurst, but Seattle is an outward-looking, conscientious place: in the 1980s Mary led a successful campaign to persuade the local university to withdraw its investments from apartheid South Africa.
By the early 90s, Bill Gates had started giving money to local schools and charities. But the donations were small compared to the billions he was earning, and he was too focused on Microsoft to pay much attention to the growing number of begging letters that his wealth and fame attracted. His parents, privately, and Seattle journalists, publicly, began to suggest that he should be more civic-minded. Then in 1995 he published The Road Ahead, a book he had co-written about the future of computing. Its sometimes bland corporate prose, Microsoft’s domineering reputation and Gates’s then unloved public persona meant that it received mixed reviews. Yet, read now, with the subsequent establishment of the foundation in mind, the book contains striking digressions about the world’s “sociological problems” and “the gap between the have and have-not nations”. There is a sense of Gates becoming curious about the world’s non-software needs and how he might help address them.
In 1994 he and his father had set up the William H Gates Foundation. Gates Sr ran it from his basement. Gates Jr wrote the cheques. The causes he backed were broader than before: birth control and reproductive health. As the foundation expanded, he and Melinda, whom he had met at Microsoft, both found themselves becoming more and more interested in the wider world. “I started to learn about poor countries and health, and got drawn in,” Gates told students at Berkeley during a speaking tour this spring. “I saw the childhood death statistics. I said, ‘Boy, is this terrible!'”
Gates has a tendency to talk about the horrors and injustices of the developing world just as he talks about the computer business: in blunt, jerky sentences, his nasal voice flat or leaping, his manner without much natural warmth or charm. He sounds like a clever man in a hurry thinking out loud – which is exactly what he is. Starting in the late 90s, he began to hungrily chew through the expert literature on global disease and nutrition and poverty. “He is a giant sponge,” says an epidemiologist who specialises in HIV. “I had dinner with him a couple of weeks ago. The man is extraordinary. I’ve been in the field 15 years, and his grasp of the technical details is just astounding. His weird brain allows him to ask questions.”
In 1999 the William H Gates Foundation and a separate charity Bill Gates had established to improve computer access in American libraries were combined into the Bill & Melinda Gates Foundation. As the couple grew richer through Microsoft, so they started making intermittent donations to a trust (from 2006 known as the Bill & Melinda Gates Foundation Asset Trust), which invested the accumulating assets. Those in turn were donated to the foundation. “In the early days, it was crazy,” says Katherine Kreiss, who joined from the American foreign service in 2002 and is refreshingly less on-message than more recent recruits. “There were so few people. I was one of 17 in total covering global health. By law in the US, as a charity you have to spend 5% of your endowment [annually], so you’re always trying to meet this number. When I started, we gave out about $1.5bn, the same as my department had when I was in the government, where I think we had a staff of 4,000! When I came into the foundation, I had 172 grants I was working on. There weren’t the people to do the rigour. We have so changed since. Now I have eight grants and I’m overwhelmed. I’m working on them in a much more detailed way.”
Gates is also much more involved. Since 2008 he and Melinda have begun regularly visiting the foreign projects it supports. “About 18 to 24 months in advance, they’re thinking about what trips they might go on,” says Kreiss. “Then they winnow down the options. We send them briefing notes. It’s really like working with a high-level principal in the government. If it’s your project, you will probably do an advance trip or two. Bill and Melinda will send their own advance team to look at logistics.” Then, sometimes accompanied by one or two carefully chosen journalists, the world’s second richest couple will visit homes or clinics in some of the world’s most blighted regions. “I’ve never actually been on a trip with Bill and Melinda,” says Kreiss. “But there are many questions. I’ve heard it’s just nonstop questions. There’s no downtime. Not a minute. It’s not a vacation.”
When they are back in Seattle, where they still live, Bill and Melinda have the use of offices at the foundation on a secluded top-floor corridor, along with Raikes and Gates Sr. This part of the building is slightly plusher – there are lights concealed in pillars and one corridor wall is an expensive-looking gold colour – but it is hardly palatial. Bill Gates’s status is denoted by something subtler: the sudden care with which foundation staff from Raikes downwards start choosing their words when the subject of Gates and his wife comes up. “The foundation is their vision, their mission,” says Kreiss. Roy Steiner, the foundation’s deputy director for agriculture (Harvard, ex-management consultant), tells me: “Bill has just recently spent a night in an Indian village. He slept in a hut in a village. How many chairs of philanthropic organisations have done that? But he’s a business guy – he wants to understand the customer.”
Outsiders who work with the foundation are sometimes less enthusiastic about his role. “Everyone in that organisation spends their whole time second-guessing what Bill will say,” says one. “They’ve got very smart people, but they’re always waiting for Bill.” According to the foundation, Gates and his wife “review” only its grants that exceed $50m, but Bill Gates’s influence can also be felt in much smaller foundation matters. Recently, an academic paper covering an area in which the charity is highly active, written by someone Gates knew quite well, was held back from him by foundation staff. “We can’t show it to him,” the author of the paper was told. “We think he won’t like it. The problem is the title of the paper. It includes a word Bill is allergic to.”
Yet sometimes, the author continues, Gates is more open-minded than his subordinates anticipate: “If you can capture his imagination, he will listen to any idea. He’s willing to say, ‘Let’s look at this.'” This year, alongside the foundation’s slick official website, a quirkier and more personal one began appearing called the Gates Notes, with sections called “What I’m Thinking About”, “What I’m Learning” and “My Travels”, and musings and recommendations on green technology, the financial crisis and the computer business as well as on the foundation’s existing activities. There is a sense of Gates, still only 54 and liberated from his round-the-clock Microsoft duties, constantly roaming beyond his charity’s already vast boundaries. The internet, the modern power of celebrity, and the ease of travel to virtually anywhere in the world enjoyed by the super-rich have made it possible for the more thoughtful, socially conscious of them – such as Gates and the financier George Soros – to become autodidacts and philosopher-kings more potent even than the last generation of famous philanthropists, such as Andrew Carnegie and John D Rockefeller.
Last May, Gates, Soros, Buffett and David Rockefeller Jr, Rockefeller’s great-grandson, held a long private meeting in New York, not far from the UN, along with an assortment of media potentates such as Ted Turner, Oprah Winfrey and Michael Bloomberg. It was reported that Gates had been involved in summoning them all together; and that the Good Club, as it supposedly called itself, discussed the world’s economic, environmental and health problems, the dangers of over-population, and how rich people could better help poor people. The Sunday Times quoted an unnamed participant at the meeting, who said that without anything “as crude as a vote” the gathering had agreed that the world’s problems “need big-brain answers . . . independent of government”.
For the internet’s many Gates-watchers and conspiracy-spotters, it was all irresistibly sinister. Last month an apparently more benign explanation appeared. A friend of Gates and Buffett, Carol Loomis, wrote in the tycoon-watchers’ magazine Fortune that the gathering had been part of a behind-the-scenes campaign by the two men and Melinda Gates, which was now ready to go public, to persuade the rest of America’s billionaires to pledge at least 50% of their wealth to charity.
Like the Gates foundation, the initiative seems laudable and refreshing in many ways – especially given the discarding of any sense of social responsibility by so many of the rich in recent decades. Several of America’s wealthiest families have already signed the pledge. And yet, some authorities on philanthropy fear the consequences of this giving boom, and dislike the faint air of playing god that hangs over its creations such as the Gates foundation. Edwards says: “The world isn’t a giant experiment. The foundation affects real people in real places. Why should Bill decide which sort of vaccines get developed?” “If you read the early reports of the Rockefeller and Carnegie foundations,” Edwards goes on, “those organisations have almost exactly the same character as the Gates foundation: top-down, technocratic, applying the language of engineering to social problems.” Edwards has worked in the charity sector since 1978, through good times and bad, and he also warns: “You can have boom and bust in this kind of ‘philanthrocapitalism’ as in capitalism itself.” Put crudely, the super-rich need to stay super-rich for their charitable enterprises to function.
The value of the Gates foundation’s endowment fell by a fifth during the 2008 banking crisis, although Raikes says the foundation did not cut its grant-making during the downturn, and its finances have recovered since. The ethical basis of the foundation’s finances has also been questioned. In 2007 an extensive investigation by the Los Angeles Times found that the charity, via its trust, invests in “companies that contribute to the human suffering in health, housing and social welfare that the foundation is trying to alleviate”. The foundation did not challenge the thrust of the articles, which included allegations that it invested in an oil company responsible for causing health problems by burning off its unwanted gas, in an African country in which the foundation was active in trying to improve the population’s health. But the charity decided after a brief review not to change its investment policy. Raikes’s predecessor Patty Stonesifer wrote to the newspaper: “The stories you told of people who are suffering touched us all. But it is naive to suggest that an individual stockholder can stop that suffering. Changes in our investment practices would have little or no impact on these issues.”
The Bill & Melinda Gates Foundation Asset Trust has always refused to invest in tobacco firms; otherwise, the outside investment managers the trust employs are instructed to seek the maximum return on its endowment, so that the foundation can be as generous as possible. It is a moral trade-off; but then uncomfortable compromises, like unequal power relationships, run through most charitable work. Many of the Gates foundation’s critics concede that the organisation is, as Edwards puts it, “closer to the best than the worst” on the spectrum of private charitable foundations: more expertly staffed, more focused on the problems where charity is most needed, more professional – there have so far been no obviously disastrous foundation-funded projects – and more prepared to change as it grows.
Raikes’s desire for fiercer internal debates at the foundation may be an acknowledgement that Gates needs to be challenged more. And Gates may want to be challenged more: according to Beeharry, among Gates’s current reading is a 2005 group biography of president Abraham Lincoln’s inner circle, Team of Rivals by Doris Kearns Goodwin: “Bill thinks it’s the best book ever written. It’s about how you embrace dissent as a leader.” Lincoln’s “genius”, writes Goodwin, was “to form friendships with men who had previously opposed him; to repair injured feelings; to assume responsibility for the failures of subordinates; to share credit with ease; and to learn from mistakes.” Given his driven decades at Microsoft, it may be hard to imagine Gates being interested in such softer qualities; but then it was hard not so long ago to imagine him giving most of his money away.
Next year the foundation is scheduled to move into new premises. Instead of the present hidden-away headquarters, and two even blander buildings it uses a mile away – staff have to shuttle between them all by minibus – the foundation will occupy a much showier hilltop “campus” in central Seattle, all dark glass and golden stone, with office blocks like ocean liners, space for at least twice the current staff, and a visitor centre the size of a small supermarket for the public to learn about the foundation’s good works. On the windows of the unfinished visitor centre, there are quotes from selected thinkers. One is from the famous American anthropologist Margaret Mead: “Never doubt that a small group of thoughtful, committed individuals can change the world; indeed, it’s the only thing that ever has.”
The way Gates and his elite staff have chosen to try to do so is by running their charity as a kind of business. Edwards calls this approach – increasingly popular at private foundations funded by business-people – philanthrocapitalism; others call it “venture philanthropy”. Steiner explains: “Sitting here in Seattle, we’re not going to solve Africa’s problems. Africans are going to solve Africa’s problems. We’ve got to find the Africans.” Often, this means the foundation mounting competitions for grant applications, and giving money to the winners, which usually means the most “pioneering” (Steiner’s word) and those that promise to fulfil a need not met by other charities.
Foundation staff describe this process, and indeed all their work, in business-school language: achieving “leverage”, building the foundation “brand”, serving “markets” and “customers”. Or they use the language of management consultancy and computing: “Bill is about numbers,” says Steiner. “He wants to see the data. He values data more than ideology.” Like all the foundation staff I meet, Steiner is personable and thoughtful, sitting tieless in his modest office. And like the others, he is both intensely idealistic and close to disdainful about the older, less business-orientated charity models. In his field of agricultural aid, he says, “We need a lot of smarter ways of doing things. We can’t do things the same old way . . . The people who’ve been in the field for so long [for other charities] don’t embrace how much transformation can happen. You walk in there as clear-eyed as you can . . . And [you] are basically optimistic that people want to improve their lives. You enable them with technology and knowledge, and great and wonderful things can happen.”
He looks into the middle distance as the soft Seattle summer drizzle hangs outside his window. With his strong gaze and open-necked striped shirt, his shelves crammed with agriculture books and box files, his line of jars filled with brightly coloured seeds on a table against the wall, and his slightly impatient body language, as if just about to set off on another of his frequent trips to Africa or Asia, he seems a little like a high-minded Victorian explorer.
But Steiner and his colleagues are probably more aware of their limitations. There is a problem with the Gates foundation that its staff, for the time being, appear to grasp better than its critics. For all the charity’s resources and connections, for all the attendant risks of over-confidence and over-mightiness, on the ground in Africa or Asia the foundation’s immense-sounding grants are a minuscule fraction of what is required to create a fairer world. “In agriculture,” says Steiner, “the problem’s this big” – he throws out his long arms – “and our resources are this big” – he pinches an inch of air between a finger and thumb. With an ex-management consultant’s precision, he concludes: “We estimate we can probably be 3-5% of the overall solution.” Then he abruptly gets up from the meeting table, turns away from me without a goodbye handshake, and goes back to his desk and computer. At the Gates foundation, they are very keen that meetings do not overrun. There is much work to be done.
WEALTHIEST MONARCHS
http://en.rian.ru/photolents/20100713/159795483.html
“Forbes unveiled a list of the world’s wealthiest monarchs. 82-year-old Thai King Phumiphon Adunyadet tops the list with $30 billion in wealth. He is not only the wealthiest but also the “longest reigning.” He has sat on the Thai throne since 1946.”
“The crowd at the inaugural event added up to a list that would make any charity – or any conspiracy theorist – swoon. Left to right: Bill Gates, Oprah Winfrey, Warren Buffett, Eli and Edythe Broad, Ted Turner, David Rockefeller, Chuck Feeney, Michael Bloomberg, George Soros, Julian Robertson, John and Tashia Morgridge, Pete Peterson”
RALLYING BILLIONAIRES to PLEDGE 50% of NET WORTH
http://money.cnn.com/magazines/fortune/fortune_archive/1986/09/29/68098/index.htm
http://money.cnn.com/magazines/fortune/fortune_archive/2006/07/10/8380864/index.htm
http://www.cnnmoney.com/2010/06/15/news/newsmakers/Warren_Buffett_Pledge_Letter.fortune/index.htm
http://features.blogs.fortune.cnn.com/2010/06/16/gates-buffett-600-billion-dollar-philanthropy-challenge/
The $600 billion challenge
by Carol J. Loomis / June 16, 2010
Just over a year ago, in May 2009, word leaked to the press that the two richest men in America, Bill Gates and Warren Buffett, had organized and presided over a confidential dinner meeting of billionaires in New York City. David Rockefeller was said to have been a host, Mayor Michael Bloomberg and Oprah Winfrey to have been among those attending, and philanthropy to have been the main subject. Pushed by the press to explain, Buffett and Gates declined. But that certainly didn’t dim the media’s interest in reaching for descriptions of the meeting: The Chronicle of Philanthropy called it “unprecedented”; both ABC News and the Houston Chronicle went for “clandestine”; a New York magazine parody gleefully imagined George Soros to have been starstruck in the presence of Oprah. One radio broadcaster painted a dark picture: “Ladies and gentlemen, there’s mischief afoot and it does not bode well for the rest of us.” No, no, rebutted the former CEO of the Bill & Melinda Gates Foundation, Patty Stonesifer, who had been at the meeting and had reluctantly emerged to combat the rumors. The event, she told the Seattle Times, was simply a group of friends and colleagues “discussing ideas” about philanthropy.
And so it was. But that discussion — to be fully described for the first time in this article — has the potential to dramatically change the philanthropic behavior of Americans, inducing them to step up the amounts they give. With that dinner meeting, Gates and Buffett started what can be called the biggest fundraising drive in history. They’d welcome donors of any kind. But their direct target is billionaires, whom the two men wish to see greatly raise the amounts they give to charities, of any and all kinds. That wish was not mathematically framed at the time of the New York meeting. But as two other U.S. dinners were held (though not leaked), Buffett and Gates and his wife, Melinda, set the goal: They are driving to get the super-rich, starting with the Forbes list of the 400 wealthiest Americans, to pledge — literally pledge — at least 50% of their net worth to charity during their lifetimes or at death.
Without a doubt, that plan could create a colossal jump in the dollars going to philanthropy, though of what size is a puzzle we’ll get to. To begin with, a word about this article you are reading. It is the first public disclosure of what Buffett and Melinda and Bill Gates are trying to do. Over the past couple of months Fortune has interviewed the three principals as the project has unfolded, as well as a group of billionaires who have signed up to add their names to the Gates/Buffett campaign. In a sense this article is also an echo of two other Fortune stories, both featuring Buffett on the cover. The first, published in 1986, was “Should you leave it all to the children?” To that query, Buffett emphatically said no. The second article, “Warren Buffett gives it away,” which appeared in 2006, disclosed Buffett’s intention to gradually give away his Berkshire Hathaway (BRK.A) fortune to five foundations, chief among them the world’s largest, the Bill & Melinda Gates Foundation. (For Buffett’s thinking on the disposition of his wealth, see “My philanthropic pledge.”)
Since then, in four years of contributions, Buffett has given the foundation $6.4 billion, not counting the 2010 gift, to be made this summer. The foundation in turn has in that same period combined Buffett’s money and its immense gifts from the Gateses to raise its level of giving to about $3 billion a year, much of it for world health. One small example: the Medicines for Malaria Venture, heavily funded by the Gates Foundation, has worked with pharmaceutical company Novartis (NVS) to develop good-tasting malaria pills and distribute them to millions of children — the principal victims of the disease — in 24 countries. Another fact about the 2006 Buffett article is that it was written by yours truly, Carol Loomis, a senior editor-at-large of Fortune. Besides that, I am a longtime friend of Buffett’s and editor of his annual letter to Berkshire’s shareholders. Through him, my husband, John Loomis, and I have also come to know Melinda and Bill Gates socially. The Loomis team has even occasionally played bridge against Warren and Bill.
All that said, the question of what philanthropy might gain from the Gates/Buffett drive rests, at its outset, on a mystery: what the wealthiest Americans are giving now. Most of them aren’t telling, and outsiders can’t pierce the veil. For that matter, the Forbes 400 list, while a valiant try, is a best-guess estimate both as to the cast of characters and as to their net worth. (Buffett says he knows of two Berkshire shareholders who should be on the list but have been missed.) As Bill Gates sums it up, “The list is imprecise.” Those qualifiers noted, the magazine stated the 2009 net worth of the Forbes 400 to be around $1.2 trillion. So if those 400 were to give 50% of that net worth away during their lifetimes or at death, that would be $600 billion. You can think of that colossal amount as what the Buffett and Gates team is stalking — at a minimum.
Leaving aside the Forbes 400 and looking simply at Internal Revenue Service data for both annual giving and estate taxes, we can piece together a picture of how far the very rich might be from a figure like that $600 billion. Start with an admirable fact about Americans as a whole: The U.S. outdoes all other countries in philanthropic generosity, annually giving in the neighborhood of $300 billion. Some of that gets reported as charitable deductions on the tax filings made by individuals. But taxpayers at low income levels don’t tend to itemize, taking the standard deduction instead. At higher income levels, charitable gift data begin to mean something. To take one example for 2007 (the latest data available), the 18,394 individual taxpayers having adjusted gross income of $10 million or more reported charitable gifts equal to about $32.8 billion, or 5.84% of their $562 billion in income.
And billionaires? Here, the best picture — though it’s flawed — emerges from statistics that the IRS has for almost two decades been releasing on each year’s 400 largest individual taxpayers, a changing universe obviously. The decision of the government to track this particular number of citizens may or may not have been spurred by the annual publication of the Forbes list. In any case, the two 400 batches, though surely overlapping, cannot be identical — for one reason because the IRS data deal with income, not net worth. The IRS facts for 2007 show that the 400 biggest taxpayers had a total adjusted income of $138 billion, and just over $11 billion was taken as a charitable deduction, a proportion of about 8%. The amount deducted, we need quickly to add, must be adjusted upward because it would have been limited for certain gifts, among them very large ones such as Buffett’s $1.8 billion donation that year to the Gates Foundation. Even so, it is hard to imagine the $11 billion rising, by any means, to more than $15 billion. If we accept $15 billion as a reasonable estimate, that would mean that the 400 biggest taxpayers gave 11% of their income to charity — just a bit more than tithing.
Is it possible that annual giving misses the bigger picture? One could imagine that the very rich build their net worth during their lifetimes and then put large charitable bequests into their wills. Estate tax data, unfortunately, make hash of that scenario, as 2008 statistics show. The number of taxpayers making estate tax filings that year was 38,000, and these filers had gross estates totaling $229 billion. Four-fifths of those taxpayers made no charitable bequests at death. The 7,214 who did make bequests gave $28 billion. And that’s only 12% of the $229 billion gross estate value posted by the entire 38,000. All told, the data suggest that there is a huge gap between what the very rich are giving now and what the Gateses and Buffett would like to suggest is appropriate — that 50%, or better, of net worth. The question is how many people of wealth will buy their argument.
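The back-of-envelope arithmetic running through the last three paragraphs is easy to verify; this short sketch simply reproduces the article’s own figures (the Forbes and IRS numbers quoted above, expressed in billions of dollars), nothing more.

```python
# Reproducing the article's back-of-envelope arithmetic.
# All figures in billions of dollars, taken directly from the text above.

forbes_400_net_worth = 1_200   # 2009 Forbes 400 estimated net worth, $bn
print(f"50% pledge from the Forbes 400: ${0.5 * forbes_400_net_worth:,.0f}bn")

# 2007: the 18,394 filers with adjusted gross income of $10m or more
print(f"$10m+ AGI filers gave {32.8 / 562:.2%} of income")        # ~5.84%

# 2007: the IRS's 400 biggest taxpayers, deduction as reported vs adjusted
print(f"Top 400, as reported:            {11 / 138:.2%}")         # ~8%
print(f"Top 400, adjusted to $15bn:      {15 / 138:.2%}")         # ~11%

# 2008: charitable bequests against total gross estate value filed
print(f"Bequests vs gross estates (2008): {28 / 229:.2%}")        # ~12%
```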
The seminal event in this campaign was that billionaires’ gathering in May 2009 — the First Supper, if you will. The Gateses credit Buffett with the basic idea: that a small group of dedicated philanthropists be somehow assembled to discuss strategies for spreading the gospel to others. The Gateses proceeded to arrange the event. Bill Gates says, with a grin, “If you had to depend on Warren to organize this dinner, it might never have happened.” In his office, meanwhile, Buffett scrawled out a name for a new file, “Great Givers.” The first item filed was a copy of a March 4 letter that Buffett and Gates sent to the patriarch of philanthropy, David Rockefeller, to ask that he host the meeting. Rockefeller, now 95, told Fortune that the request was “a surprise but a pleasure.” As a site for the event, he picked the elegant and very private President’s House at Rockefeller University in New York City, whose board he has been on for 70 years. He also tapped his son David Jr., 68, to go with him to the meeting.
The event was scheduled for 3 p.m. on Tuesday, May 5 — a day urgently desired by Bill Gates, who wanted to fit the meeting into a short U.S. break he’d be taking from a three-month European stay with his family. Because Melinda elected to remain in Europe with their three children, she did not attend the first dinner, but lined herself up for any that followed. (The Gateses have considered this campaign to be a personal matter for them, not in any way a project of the Gates Foundation.) Melinda also insisted from the start that both husbands and wives be invited to the dinners, sure that both would be important to any discussion. Her reasoning: “Even if he’s the one that made the money, she’s going to be a real gatekeeper. And she’s got to go along with any philanthropic plan, because it affects her and it affects their kids.”
The letter of invitation, dated March 24, went to more people than could come. But the hosts and guests who arrived on May 5 certainly had enough economic tickets to be there: a combined net worth of maybe $130 billion and a serious history of having depleted that amount by giving money to charity. Leaving aside the semi-observers, Patty Stonesifer and David Rockefeller Jr., there were 14 people present, starting with the senior Rockefeller, Buffett, and Gates. The local guests included Mayor Bloomberg; three Wall Streeters, “Pete” Peterson, Julian Robertson, and George Soros; and Charles “Chuck” Feeney, who made his money as a major owner of Duty Free Shoppers and has so far given away $5 billion through his foundations, called Atlantic Philanthropies. When Feeney was dropped from the Forbes 400 in 1997, the magazine explained his departure in words not often hauled out for use: “Gave bulk of holdings to charity.” The out-of-towners included Oprah, Ted Turner, and two California couples, Los Angeles philanthropists Eli and Edythe Broad, and Silicon Valley’s John and Tashia Morgridge, whose fortune came from Cisco Systems (CSCO). Both the Broads and the Morgridges had equivocated over whether to accept the invitation, regarding the trip as an inconvenience. But there were the signatures at the bottom of the letter — from left to right, Rockefeller, Gates, Buffett. “Impressive,” Eli Broad thought.
So on the appointed day the Broads found themselves seated with everyone else around a big conference table, wondering what came next. They mainly got that message from Buffett, whose quick sense of humor left him playing, says David Rockefeller Jr., “the enlivener role.” He remembers Buffett as keeping the event from being “too somber” and “too self-congratulatory.” Buffett set the ball rolling by talking about philanthropy, describing the meeting as “exploratory,” and then asking each person, going around the table, to describe his or her philosophy of giving and how it had evolved. The result was 12 stories, each taking around 15 minutes, for a total of nearly three hours. But most participants whom Fortune has talked to found the stories riveting, even when they were familiar. David Rockefeller Sr. described learning philanthropy at the knees of his father and grandfather. Ted Turner repeated the oft-told tale of how he had made a spur-of-the-moment decision to give $1 billion to the United Nations. Some people talked about the emotional difficulty of making the leap from small giving to large. Others worried that their robust philanthropy might alienate their children. (Later, recalling the meeting, Buffett laughed that it had made him feel like a psychiatrist.)
The charitable causes discussed in those stories covered the spectrum: education, again and again; culture; hospitals and health; the environment; public policy; the poor generally. Bill Gates, who found the whole event “amazing,” regarded the range of causes as admirable: “The diversity of American giving,” he says, “is part of its beauty.” At the dinner that followed, the conversation turned specifically to how giving by the rich could be increased. The ideas advanced included national recognition of great philanthropists (presidential medals, for example), or a film, or a philanthropy guidebook, or a conference of the rich. There was no talk of a pledge. Of the dinner, the junior Rockefeller says, “The most important thing my dad and I came away with was that increasing giving would take work by many in that room — delicate, and probably prolonged, one-on-one work.”
The dinner, of course, had its unexpected coda: the leak. The leaker, with little doubt, was Chuck Feeney, and the leakee was his longtime friend Niall O’Dowd, the New York publisher behind the grandly unknown IrishCentral.com. (Fortune did not succeed in reaching Feeney; of our account, O’Dowd said, “I can’t confirm that.”) On May 18, two weeks after the meeting, IrishCentral.com posted an article of 14 short paragraphs headlined “Secret meeting of world’s richest people held in New York.” With that, the fame of the website spiked, as the rest of the press picked up the news and ran with it. The IrishCentral article exhibited some confusion about which Rockefeller starred at the dinner, or was even there, but otherwise provided the names of all the participants — with the notable exception of Feeney, who apparently didn’t realize he looked more conspicuous to the others by being left out. Feeney, however, appears to have been quoted anonymously in the piece, once as an “attendee” who thought Gates the most impressive speaker of the day, Turner the most outspoken (surprise!), and Buffett the most insistent on his agenda for change. In a second instance, Feeney was a good bet to have been the awed “participant” who extolled his fellow guests: “They were all there, the great and the good.”
The main effect of the leak was to place a “cone of silence” — that’s a description from the Gates camp — over everything that transpired in the giving campaign over the next year. But there was certainly action, including a few small dinners abroad. Bill and Melinda Gates hosted a dinner in London, and Bill held a few others in India and China. Raising the philanthropic bar in foreign countries is a special challenge: Dynastic wealth is widely taken for granted; tax laws do not commonly allow deductions for gifts to charity; a paucity of institutions and organizations ready for gifts makes knowing whom to give to just not that obvious. Nonetheless, were the Gateses and Buffett to succeed in their campaign in the U.S., they would probably take it overseas. But as last summer and fall progressed, Buffett and the Gateses did not even have a plan for how the campaign was to be structured. In this vacuum the idea of a pledge took hold and gained strength. It helped that more dinners were to be held. At them, says Melinda, the three principals would “float the pledge idea to see if it would fly.”
There then occurred the second and third U.S. dinners, most of whose guests have not been publicly outed because of the cone of silence. Secrecy, a Gates spokesman says, is partly a bow to moguls who have been exposed to the philanthropic sales pitch but would be embarrassed to have been identified in case they chose not to step up to the challenge. In any event, the names of some of the participants are known. The noted philanthropists at the second dinner, held at the New York Public Library in November last year, included New York investment banker Kenneth Langone and his wife, Elaine, and H.F. “Gerry” Lenfest and his wife, Marguerite, from Philadelphia. Lenfest got rich when he sold his Pennsylvania cable television company to Comcast (CMCSA) in 2000, netting $1.2 billion for himself and his family. He promptly vowed that he would give most of it to charity in his lifetime. Now 80, he has so far meted out $800 million, a good part of it to schools he attended (Columbia Law School, Washington and Lee, Mercersburg Academy).
Lenfest’s favorite moment at the November dinner was Buffett’s declaration that Marguerite Lenfest had put forward the best idea of the evening when she said that the rich should sit down, decide how much money they and their progeny need, and figure out what to do with the rest of it. Says Lenfest: “The value of Buffett and Gates is that they’re going to make people sit down and think these things through.” The Third Supper, held in December in Menlo Park, Calif., at the Rosewood Sand Hill hotel, is known as the Bay Area dinner but drew from all over the state, including its entertainment precincts. In attendance were some veteran philanthropists, including venture capitalist John Doerr of Kleiner Perkins and his wife, Ann, and the Morgridges, who had selected the meeting site. This dinner was somewhat different from the other two, says Melinda Gates, because a few people there were relatively new to huge wealth and were still forming their opinions about giving. Talk went on for hours, so long that the beef being prepared for dinner became somewhat overcooked. This is reported to have dismayed Rosewood’s management, which may have noticed that the crowd in the Dogwood room was worth having back.
The dinner also brought out some of the fears that people have about philanthropy. What does going public with big gifts do to the peace in your life? Won’t pleas from charities be unending? How do you deal with giving internationally, which too often seems like throwing money down a hole? These are valid concerns, say the Gateses, the kind raised by people who want to feel as smart about giving as they were about making their money. But the questions didn’t stop the two from plugging the satisfactions of philanthropy. At those dinners, says Bill, “no one ever said to me, ‘We gave more than we should have.'” Nor did the idea of a pledge get shot down at those dinners. It “floated” nicely, in other words. So as 2010 arrived, a pledge became the strategy. The idea of aiming for a 50% slice of net worth was pragmatically pulled from the sky, being less than the principals would have liked to ask for but perhaps as much, at least initially, as they can get. The pledges, meanwhile, were never envisioned as legal contracts but rather moral obligations to be both memorialized in writing and taken very seriously. They are in fact to be posted on a new website, givingpledge.org, whose construction Melinda Gates oversaw. The 99% pledge that Buffett is making is likely to be the No. 1 document on the website, if he is not beaten out by his Seattle friends.
Enthusiastic about leading the search for Great Givers, the Gateses and Buffett nonetheless have wanted a phalanx of strong supporters. Already committed to at least a 50% pledge are the Broads, the Doerrs, the Lenfests, and the Morgridges. With the online publication of this article, moreover, the three principals will send e-mails and make calls to other billionaires judged likely prospects. A bit later, all of the pledgers may join in sending a letter to a large number of other billionaires, asking them to join the growing crowd. In the fall there may even be a Great Givers conference. The definition of success in this venture may take years to figure out, but each of the principals has reflections about the matter. Buffett knows that everyone rich has thought about what to do with his or her money: “They may not have reached a decision about that, but they have for sure thought about it. The pledge that we’re asking them to make will put them to thinking about the whole issue again.” He warns, most of all, against the rich delaying the decision of what to do with their money: “If they wait until they’re making a final will in their nineties, the chance of their brainpower and willpower being better than they are today is nil.”
Bill Gates regards the 50% as a “low bar” encouraging high participation. People, he thinks, may be drawn in by that proportion and then surprise themselves and find they are giving at higher levels. “This is about moving to a different realm,” he thinks, and it will take time for everything to sort out. Melinda Gates separates the near-term from the far. There are so many reasons that rich people don’t give, she says: They don’t want to plan for their death; they worry that they’ll need to hire someone to help with the work; they just don’t want to take the time to think about it all. So the initial goal of the pledge campaign, she thinks, must be simply to cut through that and get them moving in the direction of giving. And eventually? “Three to five years down the road, we need to have a significant number of billionaires signed up. That would be success.” Society cannot help but be a beneficiary here, by virtue of at least some dollars and perhaps many. Nor will it be just the very rich who will perhaps bend their minds to what a pledge of this kind means. It could also be others with less to give but suddenly more reason to think about the rightness of what they do.
‘EXTEND and PRETEND’
http://online.wsj.com/article/SB10001424052748704764404575286882690834088.html
To Fix Sour Property Deals, Lenders ‘Extend and Pretend’
By CARRICK MOLLENKAMP And LINGLING WEI / JULY 7, 2010
Some banks have a special technique for dealing with business borrowers who can’t repay loans coming due: Give them more time, hoping things improve and they can repay later. Banks call it a wise strategy. Skeptics call it “extend and pretend.” Banks are applying it, in particular, to commercial real-estate lending, where, during the boom, optimistic borrowers got in over their heads to the tune of tens of billions of dollars. A big push by banks in recent months to modify such loans—by stretching out maturities or allowing below-market interest rates—has slowed a spike in defaults. It also has helped preserve banks’ capital, by keeping some dicey loans classified as “performing” and thus minimizing the amount of cash banks must set aside in reserves for future losses.
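To make the capital mechanics concrete, here is a minimal sketch in Python. The reserve rates are invented for illustration (actual provisioning depends on a bank's loss models and its examiners), and only the loan size echoes a figure from this article:

```python
# A minimal sketch (hypothetical reserve rates, invented for illustration)
# of why keeping a loan classified as "performing" preserves bank capital:
# the loss reserve a bank must set aside jumps with the classification.

def required_reserve(balance: float, classification: str) -> float:
    """Loan-loss reserve under illustrative, made-up provisioning rates."""
    rates = {
        "performing": 0.02,     # small general reserve
        "substandard": 0.15,    # troubled but still collateralized
        "nonperforming": 0.50,  # specific reserve against likely loss
    }
    return balance * rates[classification]

loan = 13_500_000  # the size of the Georgian Bank credit described below
for status in ("performing", "substandard", "nonperforming"):
    print(f"{status:>13}: reserve = ${required_reserve(loan, status):,.0f}")

# Extending the maturity keeps the loan "performing": a ~$270,000 reserve
# instead of several million, which is the capital effect the article describes.
```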
Restructurings of nonresidential loans stood at $23.9 billion at the end of the first quarter, more than three times the level a year earlier and seven times the level two years earlier. While not all were for commercial real estate, the total makes clear that large numbers of commercial-property borrowers got some leeway. But the practice is creating uncertainties about the health of both the commercial-property market and some banks. The concern is that rampant modification of souring loans masks the true scope of the commercial property market weakness, as well as the damage ultimately in store for bank balance sheets.
In Atlanta, Georgian Bank lent $13.5 million to a company in late 2007, some of it to buy land for a 53-story luxury Mandarin Oriental hotel and condo development. The loan came due in November 2008, but the bank extended its maturity date by a year. The bank extended it again to May 2010, with an option for a further extension to November 2010, according to court documents.
Georgia’s banking regulator shut down the bank last September. A subsequent U.S. regulatory review cited “lax” loan underwriting and “an aggressive growth strategy…that coincided with declining economic conditions in the Atlanta metropolitan area.” Some of Georgian Bank’s assets were assumed by First Citizens Bank and Trust Co. of Columbia, S.C., which began foreclosure proceedings on the still-unbuilt luxury development. The borrowers contested the move, and settlement talks are in progress.
Also in Atlanta, Bank of America Corp. has extended a loan twice for a high-end shopping and residential project. Three years after a developer launched the Streets of Buckhead project as a European-style shopping district, all there is to show for it is a covey of silent cranes and a fence. The developer, Ben Carter, says he is in final negotiations for an investor to come in and inject $200 million into the languishing development.
Regulators helped spur banks’ recent approach to commercial real estate by crafting new guidelines last October. They gave banks a variety of ways to restructure loans. And they allowed banks to record loans still operating under the original terms as “performing” even if the value of the underlying property had fallen below the loan amount—which is an ominous sign for ultimate repayment. Although regulators say banks shouldn’t take the guidelines as a signal to cut borrowers more slack, it appears some did.
Banks hold some $176 billion of souring commercial-real-estate loans, according to an estimate by research firm Foresight Analytics. About two-thirds of bank commercial real-estate loans maturing between now and 2014 are underwater, meaning the property is worth less than the loan on it, Foresight data show. U.S. commercial-real-estate values remain 42% below their October 2007 peak and only slightly above the low they hit in October 2009, according to Moody’s Investors Service. In the first quarter, 9.1% of commercial-property loans held by banks were delinquent, compared with 7% a year earlier and just 1.5% in the first quarter of 2007, according to Foresight.
Holding off on foreclosing is often good business, says Mark Tenhundfeld, senior vice president at the American Bankers Association. “It can be better for a bank to extend a loan and increase the chance that the bank will be repaid in full rather than call the loan due now and dump more property on an already-depressed market,” he says.
But continuing to extend loans and otherwise modify them, rather than foreclosing, amounts to a bet that the economy will rebound enough to enable clients to find new demand for the plethora of offices, hotels, condos and other property on which they borrowed. If it doesn’t work out this way, the banks will end up having to write off the loans anyway. At that point, if they haven’t been setting aside sufficient cash all along for potential losses on such loans, the banks will face a hit to their earnings.
Banks’ reluctance to bite the bullet on some deteriorating commercial real estate can have economic repercussions. The readiness to stretch out loans puts a floor under commercial real estate and keeps it from hitting bottom, which may be a precondition for a robust revival. More broadly, the failure to get the loans off banks’ books tends to deter new lending to others. It’s a pattern somewhat reminiscent, although on a lesser scale, of the way Japanese banks’ failure to write off souring loans in the 1990s contributed to years of stagnation.
It’s a Catch-22 for banks. As long as some of their capital is tied up in real-estate loans that are struggling—and as the banks see a pipeline of still-more sour real-estate debt that will mature soon—their lending is likely to remain constricted. But to wipe the slate clean by writing off many more loans would mean an even bigger hit to their capital. “It does not take much of a write-down to wipe out capital,” says Christopher Marinac, managing principal at FIG Partners LLC, a bank research and investment firm. Federal bank regulators, who have said they were concerned about losses on commercial property and the volume of commercial-property debt coming due, tackled the issues in October with a 33-page set of guidelines.
Another problem they sought to resolve was that banks and their examiners weren’t always on the same page. In some cases banks weren’t recognizing loan problems, while in other cases, tough bank examiners were forcing banks to downgrade loans the bankers believed were still good. The guidance was intended “to promote both prudent commercial real-estate loan workouts by banks and balanced and consistent reviews of these loans by the supervisory agencies,” said Elizabeth Duke, a Federal Reserve governor, in a March speech. The guidelines came from the Federal Financial Institutions Examination Council, which includes the Fed, the Federal Deposit Insurance Corp. and the Comptroller of the Currency.
Although one goal was greater consistency in the treatment of commercial real-estate loans, in practice, the guidelines appear to have fed confusion in the markets about how banks are dealing with commercial real-estate debt. “I just don’t believe that the standard is being applied consistently across the industry,” says Edward Wehmer, chief executive of Wintrust Financial Corp. in Lake Forest, Ill. In a May conference call with 1,400 bank executives, regulators sought to clear up confusion. “We don’t want banks to pretend and extend,” Sabeth Siddique, Federal Reserve assistant director of credit risk, said on the call. “We did hear from investors and some bankers interpreting this guidance as a form of forbearance, and let me assure you it’s not.”
Restructurings increased at some banks, like BB&T Corp. of Winston-Salem, N.C. Its total of one type of restructured commercial loan hit $969 million in recent months, the bank reported in April. That was a huge jump from six months earlier, when the figure was just $68 million. The increase was “basically a function of implementing the new regulatory guidance,” the bank’s finance chief, Daryl Bible, told investors in May. “We are working with our customers trying to keep them in the loans.”
BB&T’s report showed a significant number of cases where it was extending loan maturities and allowing interest rates not widely available in the market for loans of similar risk. Banks don’t have to disclose how terms on their loans have changed, making it hard to know whether they are setting aside enough cash for possible losses. In a large proportion of cases, modifying the terms of loans ultimately isn’t enough to save them. At the end of the first quarter, 44.5% of debt restructurings were 30 days or more delinquent or weren’t accruing interest, up from 28% in the first quarter of 2008.
A case in Portland, Ore., shows how banks can keep treating a commercial loan as current, despite the difficulties of the underlying project. In 2007, a client called Touchmark Living Centers Inc. borrowed $15.9 million, in two loans, to buy land for a development. The borrower planned to retire the loans at the end of the year by obtaining construction financing to build the Touchmark Heights community for empty-nesters. Because the raw land produced no income, the lender, Umpqua Bank, had provided “interest reserves” with which the developer could cover interest payments while obtaining permits and preparing to build. The bank extended Touchmark a $350,000 interest reserve—in effect increasing what Touchmark owed by that amount.
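A rough sketch of how such a funded interest reserve plays out: only the $15.9 million principal and the $350,000 reserve come from the article, while the 7% rate and the monthly schedule are assumptions made for illustration:

```python
# Rough sketch of a funded interest reserve (assumed 7% annual rate; only
# the principal and reserve figures come from the article). Each month's
# interest is paid by drawing on the reserve, and each draw is added to
# the loan balance -- so the loan stays "current" while the debt grows.

balance = 15_900_000          # land loans to the borrower
reserve_commitment = 350_000  # interest reserve provided by the lender
monthly_rate = 0.07 / 12      # assumed 7% annual rate, accrued monthly

months = 0
while True:
    interest = balance * monthly_rate
    if interest > reserve_commitment:
        break                          # reserve exhausted
    balance += interest                # the draw is capitalized into the loan
    reserve_commitment -= interest
    months += 1

print(f"Loan stayed current for {months} months with no cash from the borrower;")
print(f"balance grew to ${balance:,.0f}.")
```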
In December 2007, the U.S. economy slipped into recession. When the loans came due that month, Touchmark didn’t pay them off. Umpqua extended the maturity to May 31, 2008. The bank also added $600,000 to the interest reserves. Though supplying interest reserves is common at the outset of a loan, when an unbuilt project can’t produce any income with which to pay debt service, replenishing interest reserves is frowned on by regulators. Asked to comment, a spokeswoman for the bank said, “Umpqua and Touchmark had determined that the project was still viable but not yet ready for development.” Touchmark said it didn’t pursue construction financing at that time because “it was not prudent to proceed with developing the property until the economy improves,” as a spokeswoman put it.
In 2008 the bank extended the loans again, to April 2009. During this time, Touchmark began paying interest on the loans out of its own pocket. Then in May 2009, Umpqua restructured the loans, lumping what was owed into one $15 million loan that required regular payments on both interest and principal. Touchmark paid down the principal a little and Umpqua set a new maturity date—May 5, 2012.
Meanwhile, the value of the land Touchmark had borrowed to purchase has been eroding. The bank says it was worth $23.5 million by the most recent independent appraisal, but that was in 2008. The county assessment and taxation department pegged the land’s value at about $20 million at the start of 2009. An appraiser for the department estimates raw-land values in the area fell by another 25% to 30% last year. Touchmark executives declined to estimate the land’s value. They said the property has retained “significant” value, partly because of its location, with a view of 11,240-foot Mount Hood. Umpqua Bank says the loan is accruing interest, and it expects the loan to be repaid.
DEFAULT?
http://www.econlib.org/library/Columns/y2009/Hummeltbills.html
Why Default on U.S. Treasuries is Likely
by Jeffrey Rogers Hummel
“Buried within the October 3, 2008 bailout bill was a provision permitting the Fed to pay interest on bank reserves. Within days, the Fed implemented this new power, essentially converting bank reserves into more government debt. Now, any seigniorage that government gains from creating bank reserves will completely vanish or be greatly reduced.”
Almost everyone is aware that federal government spending in the United States is scheduled to skyrocket, primarily because of Social Security, Medicare, and Medicaid. Recent “stimulus” packages have accelerated the process. Only the naively optimistic actually believe that politicians will fully resolve this looming fiscal crisis with some judicious combination of tax hikes and program cuts. Many predict that, instead, the government will inflate its way out of this future bind, using Federal Reserve monetary expansion to fill the shortfall between outlays and receipts. But I believe, in contrast, that it is far more likely that the United States will be driven to an outright default on Treasury securities, openly reneging on the interest due on its formal debt and probably repudiating part of the principal.
To understand why, we must look at U.S. fiscal history. Economists refer to the revenue that government or its central bank generates through monetary expansion as seigniorage. Outside of America’s two hyperinflations (during the Revolution and under the Confederacy during the Civil War), seigniorage in this country peaked during World War II, when it covered nearly a quarter of the war’s cost and amounted to about 12 percent of Gross Domestic Product (GDP). By the Great Inflation of the 1970s, seigniorage was below two percent of federal expenditures or less than half a percent of GDP.(1) This was partly a result of globalization, in which international competition disciplines central banks. And it also was the result of sophisticated financial systems, with fractional reserve banking, in which most of the money that people actually hold is created privately, by banks and other financial institutions, rather than by government. Consider how little of your own cash balances are in the form of government-issued Federal Reserve notes and Treasury coin, rather than in the form of privately created bank deposits and money market funds. Privately created money, even when its quantity expands, provides no income to government. Consequently, seigniorage has become a trivial source of revenue, not just in the United States, but also throughout the developed world. Only in poor countries, such as Zimbabwe, with their primitive financial sectors, does inflation remain lucrative for governments.
The current financial crisis, moreover, has reinforced the trend toward lower seigniorage. Buried within the October 3, 2008 bailout bill, which set up the Troubled Asset Relief Program (TARP), was a provision permitting the Fed to pay interest on bank reserves, something other major central banks were doing already. Within days, the Fed implemented this new power, essentially converting bank reserves into more government debt. Fiat money traditionally pays no interest and, therefore, allows the government to purchase real resources without incurring any future tax liability. Federal Reserve notes will, of course, continue to earn no interest. But now, any seigniorage that government gains from creating bank reserves will completely vanish or be greatly reduced, depending entirely on the differential between market interest rates on the remaining government debt and the interest rate on reserves. The lower is this differential, the less will be the seigniorage. Indeed, this new constraint on seigniorage becomes tighter as people replace the use of currency with bank debit cards and other forms of electronic fund transfers. In light of all these factors, even inflation well into the double digits can do little to alleviate the U.S. government’s potential bankruptcy.
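The paragraph's claim can be written out compactly. The notation below is an editorial gloss rather than Hummel's own: the monetary base B = C + R splits into currency C and bank reserves R, i_T is the market rate on Treasury debt, and i_R is the rate paid on reserves:

```latex
% Notation is an editorial gloss, not Hummel's: B = C + R is the monetary
% base (currency plus bank reserves); i_T is the market rate on Treasury
% debt; i_R is the interest rate paid on reserves.
\[
  S \;\approx\; \Delta C + \Delta R
  \qquad \text{(zero-interest base: every new dollar is revenue)}
\]
\[
  S_{\mathrm{IOR}} \;\approx\; \Delta C + (i_T - i_R)\,R
  \qquad \text{(interest-bearing reserves earn only the spread)}
\]
% As i_R approaches i_T, the reserve term vanishes; and as electronic
% payments displace currency, \Delta C shrinks too -- hence the claim that
% even double-digit inflation would raise little revenue.
```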
What about increasing the proceeds from explicit taxes? Examine Graph 1, which depicts both federal outlays and receipts as a percent of GDP from 1940 to 2008. Two things stand out. First is the striking behavior of federal tax revenue since the Korean War. Displaying less volatility than expenditures, it has bumped up against 20 percent of GDP for well over half a century. That is quite an astonishing statistic when you think about all the changes in the tax code over the intervening years. Tax rates go up, tax rates go down, and the total bite out of the economy remains relatively constant. This suggests that 20 percent is some kind of structural-political limit for federal taxes in the United States. It also means that variations in the deficit resulted mainly from changes in spending rather than from changes in taxes. The second fact that stands out in the graph is that federal tax revenue at the height of World War II never quite reached 24 percent of GDP. That represents the all-time high in U.S. history, should even the 20-percent-of-GDP post-war barrier prove breachable.(2)
Compare these percentages with that of President Barack Obama’s first budget, which is slated to come in at above 28 percent of GDP. Although this spending surge is supposed to be significantly reversed when the recession is over, the administration’s own estimates have federal outlays never falling below 22 percent of GDP. And that is before the Social Security and Medicare increases really kick in. In its latest long-term budget scenarios, the Congressional Budget Office (CBO), not known for undue pessimism, projects that total federal spending will rise over the next 75 years to as much as 35 percent of GDP, not counting any interest on the accumulating debt, which critically varies with how fast tax revenues rise. However, the CBO’s highest projection for tax revenue over the same span reaches a mere 26 percent of GDP. Notice how even that “optimistic” projection assumes that Americans will put up with, on a regular peacetime basis, a higher level of federal taxation than they briefly endured during the widely perceived national emergency of the Second World War. Moreover, once you add in the interest on the growing debt because of the persistent deficits, federal expenditures in 2083, according to the CBO, could range anywhere between 44 and 75 percent of GDP.(3)
We all know that there is a limit to how much debt an individual or institution can pile on if future income is rigidly fixed. We have seen why federal tax revenues are probably capped between 20 and 25 percent of GDP; reliance on seigniorage is no longer a viable option; and public-choice dynamics tell us that politicians have almost no incentive to rein in Social Security, Medicare, and Medicaid. The prospects are, therefore, sobering. Although many governments around the world have experienced sovereign defaults, U.S. Treasury securities have long been considered risk-free. That may be changing already. Prominent economists have started considering a possible Treasury default, while the business-news media and investment rating agencies have begun openly discussing a potential risk premium on the interest rate that the U.S. government pays. The CBO estimates that the total U.S. national debt will approach 100 percent of GDP within ten years, and when Japan’s national debt exceeded that level, the ratings of its government securities were downgraded.
In February 2009, the market for the much (unfairly) maligned credit default swaps (CDS) was charging more for insurance against a default on U.S. Treasuries than for insurance against the default of such major U.S. corporations as Pepsico, IBM, and McDonald’s. Because the premiums and payoffs of the CDS on U.S. Treasury securities are denominated in Euros, the annual premiums also reflect exchange-rate risk, which is probably why, with the subsequent modest decline in the dollar, CDS premiums for ten-year Treasuries fell from 100 basis points to almost 30.(4) But you can make a plausible case that CDS underestimate the probability of a Treasury default, since such a default could easily have far-reaching financial repercussions, even hurting the counterparties providing the insurance and impinging on their ability to make good on their CDS. Surely the purchasers of the U.S. Treasury CDS have not overlooked this risk, which would be reflected in a lower annual premium for less-valuable insurance.
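A standard market rule of thumb, not from the article, converts those premiums into an implied annual default probability: a spread s with expected recovery R implies roughly p ≈ s/(1 − R). Assuming a conventional 40% recovery rate:

```latex
\[
  s \;\approx\; p\,(1 - R)
  \quad\Longrightarrow\quad
  p \;\approx\; \frac{s}{1 - R}
\]
\[
  p_{100\,\mathrm{bp}} \approx \frac{0.0100}{0.60} \approx 1.7\%\ \text{per year},
  \qquad
  p_{30\,\mathrm{bp}} \approx \frac{0.0030}{0.60} \approx 0.5\%\ \text{per year}.
\]
```

On that reading, even the lower premium priced a small but distinctly nonzero chance of default, with the caveat the article itself raises: the euro denomination folds exchange-rate risk into the quoted spreads.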
Predicting an ultimate Treasury default is somewhat empty unless I can also say something about its timing. The financial structure of the U.S. government currently has two nominal firewalls. The first, between Treasury debt and unfunded liabilities, is provided by the trust funds of Social Security, Medicare, and other, smaller federal insurance programs. These give investors the illusion that the shaky fiscal status of social insurance has no direct effect on the government’s formal debt. But according to the latest intermediate projections of the trustees, the Hospital Insurance (HI-Medicare Part A) trust fund will be out of money in 2017, whereas the Social Security (OASDI) trust funds will be empty by 2037.(5) Although other parts of Medicare are already funded from general revenues, when HI and OASDI need to dip into general revenues, the first firewall is gone. If investors respond by requiring a risk premium on Treasuries, the unwinding could move very fast, much like the sudden collapse of the Soviet Union. Politicians will be unable to react. Obviously, this scenario is pure speculation, but I believe it offers some insight into the potential time frame.
The second financial firewall is between U.S. currency and government debt. It is not literally impossible that the Federal Reserve could unleash the Zimbabwe option and repudiate the national debt indirectly through hyperinflation, rather than have the Treasury repudiate it directly. But my guess is that, faced with the alternatives of seeing both the dollar and the debt become worthless or defaulting on the debt while saving the dollar, the U.S. government will choose the latter. Treasury securities are second-order claims to central-bank-issued dollars. Although both may be ultimately backed by the power of taxation, that in no way prevents government from discriminating between the priority of the claims. After the American Revolution, the United States repudiated its paper money and yet successfully honored its debt (in gold). It is true that fiat money, as opposed to a gold standard, makes it harder to separate the fate of a government’s money from that of its debt. But Russia in 1998 is just one recent example of a government choosing partial debt repudiation over a complete collapse of its fiat currency.
Admittedly, seigniorage is not the only way governments have benefited from inflation. Inflation also erodes the real value of government debt, and if the inflation is not fully anticipated, the interest the government pays will not fully compensate for the erosion. This happened during the Great Inflation of the 1970s, when investors in long-term Treasury securities earned negative real rates of return, generating for the government maybe one percent of GDP, or about twice as much implicit revenue as came from seigniorage. But today’s investors are far savvier and less likely to get caught off guard by anything less than hyperinflation. To be clear, I am not denying that a Treasury default might be accompanied by some inflation. Inflationary expectations, along with the fact that part of the monetary base is now de facto government debt, can link the fates of government debt and government money. This is all the more reason for the United States to try to break the link and maintain the second financial firewall. We still may end up with the worst of both worlds: outright Treasury default coupled with serious inflation. I am simply denying that such inflation will forestall default.
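The one-percent-of-GDP figure is easy to sanity-check with a back-of-the-envelope formula. Both inputs below are assumptions chosen to match the era, not numbers from the article:

```latex
% Both inputs are assumptions for illustration: an inflation surprise of
% four percentage points and privately held long-term federal debt of
% about a quarter of GDP.
\[
  \text{implicit revenue} \;\approx\; (\pi - \pi^{e})\,\frac{D}{\mathrm{GDP}}
  \;\approx\; 0.04 \times 0.25 \;=\; 0.01\ \text{of GDP},
\]
% i.e., roughly the one percent of GDP per year that the article cites,
% where \pi - \pi^e is the unanticipated inflation and D the nominal debt.
```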
Still unconvinced that the Treasury will default? The Zimbabwe option illustrates that other potential outcomes, however unlikely, are equally unprecedented and dramatic. We cannot utterly rule out, for instance, the possibility that the U.S. Congress might repudiate a major portion of promised benefits rather than its debt. If it simply abolished Medicare outright, the unfunded liability of Social Security would become tractable. Indeed, one of the current arguments for the adoption of nationalized health care is that it can reduce Medicare costs. But this argument is based on looking at other welfare States such as Great Britain, where government-provided health care was rationed from the outset rather than subsidized with Medicare. Rationing can indeed drive down health-care costs, but after more than forty years of subsidized health care in the United States, how likely is it that the public will put up with severe rationing or that the politicians will attempt to impose it? And don’t kid yourself; the rationing will have to be quite severe to stave off a future fiscal crisis.
Other welfare States have higher taxes as a proportion of GDP, with Sweden and Denmark in the lead at nearly 50 percent.(6) Can I really be confident that the United States will never follow their example? Let us ignore all the cultural, political, and economic differences between small, ethnically unified European States and the United States. We still must factor in the take of state and local governments, which, together with the federal government, raises the current tax bite in the United States to 28 percent of GDP, only five percentage points below that of Canada. Recall that the CBO projects that federal spending alone for 2082 will reach almost 35 percent of GDP, excluding rising interest on the national debt. Thus, if taxes were to rise pari passu with spending, the United States might be able to forestall bankruptcy with a total tax burden, counting federal, state, and local, of around 45 percent of GDP—15 percentage points higher than the combined total at its World War II peak, higher than in the United Kingdom and Germany today, and nearly dead even with Norway and France. However, if there is any significant lag between expenditure and tax increases, the increased debt would cause the proportion to rise even more. Furthermore, this estimate relies on the CBO’s economic and demographic assumptions about the future, along with the assumption of absolutely no increase in state and local taxation as a percent of GDP. More-pessimistic assumptions also drive the percentage up.
Even conceding that federal taxes might rise rapidly, to a level noticeably higher than during World War II, overlooks an important consideration: All the social democracies are facing similar fiscal dilemmas at almost the same time. Pay-as-you-go social insurance is just not sustainable over the long run, despite the higher tax rates in other welfare States. Even though the United States initiated social insurance later than most of these other welfare States, it has caught up with them because of the Medicare subsidy. In other words, the social-democratic welfare State will come to an end, just as the socialist State came to an end. Socialism was doomed by the calculation problem identified by Ludwig Mises and Friedrich Hayek. Mises also argued that the mixed economy was unstable and that the dynamics of intervention would inevitably drive it towards socialism or laissez faire. But in this case, he was mistaken; a century of experience has taught us that the client-oriented, power-broker State is the gravity well toward which public choice drives both command and market economies. What will ultimately kill the welfare State is that its centerpiece, government-provided social insurance, is simultaneously above reproach and beyond salvation. Fully-funded systems could have survived, but politicians had little incentive to enact them, and much less incentive to impose the huge costs of converting from pay-as-you-go. Whether this inevitable collapse of social democracies will ultimately be a good or bad thing depends on what replaces them.
DEBT RELIEF, as in CONGO & LIBERIA
http://www.eurodad.org/whatsnew/reports.aspx?id=3946
http://jubileeusa.typepad.com/blog_the_debt/2010/07/liberias-debt-cancelled-29th-and-worlds-poorest-country-reaches-completion-point.html
http://www.jubileeusa.org/press/press-item/article/imf-takes-two-steps-forward-and-one-step-back-on-haiti.html
http://www.imf.org/external/pubs/ft/survey/so/2010/car062910a.htm
http://www.imf.org/external/np/exr/facts/hipc.htm
http://www.imf.org/external/np/sec/pr/2010/pr10267.htm
http://www.imf.org/external/np/sec/pr/2010/pr10274.htm
http://www.eurodad.org/whatsnew/articles.aspx?id=4198
Liberia and DRC debt relief sparks mixed reactions
by Øygunn Sundsbø Brynildsen / 15 July 2010
The World Bank and the IMF have granted debt relief to Liberia and the Democratic Republic of Congo (DRC). The debt relief – granted after the two countries fulfilled the conditions for completing the Heavily Indebted Poor Countries Initiative (HIPC) – is essential for fighting poverty and increasing fiscal space in the two countries that rank as number 169 (Liberia) and 176 (DRC) out of the 182 countries on the UNDP 2010 Human Development Index. Although debt relief for poor countries is much needed and welcome, civil society regrets that the international creditors have not recognised their share of responsibility for what CSOs believe to be illegitimate debts contracted by dictatorial regimes without the consent of their peoples or their legitimate democratic representatives.
Liberia, the world’s highest debt burden
After almost two decades marked by warlords and civil war in the 1990s and early 2000s, Liberia had, according to the IMF, the highest debt-to-GDP ratio in the world. Because of the conflicts, there were long periods when Liberia did not service its debt, resulting in enormous accumulations of interest, sometimes many times higher than the original loan. When the government headed by Ellen Johnson-Sirleaf came to power in 2006, Liberia resumed monthly payments on debt in arrears as a sign of goodwill towards the international creditor community. However, arrears clearance diverted much-needed resources from investment and infrastructure to ensure the economic recovery of the war-torn country, and from essential services for the 80 percent of Liberian citizens living below the poverty line.
The current debt relief of USD 4.6 billion reduces Liberia’s external debt stock by more than 90 percent. USD 1.5 billion is to be delivered by multilateral creditors and the remainder by bilateral and commercial creditors. Still, if all creditors live up to their commitments, the remaining debt of Liberia will amount to USD 150 million, which the country is scheduled to start servicing from the end of 2011.
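A quick check of those figures, assuming the USD 4.6 billion in relief and the USD 150 million remainder together make up the whole pre-relief stock:

```latex
\[
  \frac{4.6}{4.6 + 0.15} \;\approx\; 96.8\% \;>\; 90\%
\]
```

On that arithmetic, the stated "more than 90 percent" is a comfortable understatement.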
International creditors turn a blind eye to reckless behaviour
Debt campaigners called for debt cancellation for Liberia and the DRC on the basis of the dubious legitimacy of these countries’ external debts. In the case of Liberia, autocrat Samuel Doe was lent money by the G8 in return for the nation’s support against Libya’s Moammar Qaddafi in the 1980s. Charles Taylor was also loaned huge amounts despite the highly questionable democratic credentials of his regime. In the case of DRC, even the IMF itself warned against lending to the Mobutu regime. In 1978 Erwin Blumenthal, the IMF representative in Zaire, made it clear that there was “no (repeat: no) prospect for Zaire’s creditors to get their money back in any foreseeable future”. Lenders, including the IMF and the World Bank, nevertheless continued to provide loans to the dictator’s regime.
The call for recognition of the illegitimacy of the DRC debt is echoed by many, including the Financial Times, where William Wallis wrote that “The Democratic Republic of Congo’s debt burden was perhaps the most odious in Africa – a financial carbuncle from the cold war that should long since have been excised.” In this context of reckless behaviour by lenders, the decision to cancel DRC’s debt should have been an easy one. However, only after seven years of implementing reforms required by the IFIs has the DRC’s debt been alleviated. “For seven years the World Bank and IMF have been saying that the country is mismanaged. By maintaining this pressure they succeeded in heading off questions about why the debt was created in the first place and who signed off on the other side,” Michel Losembe, managing director of Citibank in Kinshasa, said to the Financial Times.
Business disputes delayed debt relief to DRC
Additional policy requirements have also come into play and delayed the long-awaited debt relief. DRC was originally scheduled for debt relief last year, but the process was put on hold because DRC considered making a mining agreement with China. This year, at Canada’s request, the World Bank postponed its decision on debt relief for DRC due to a dispute over mineral rights between the Canada-based mining firm First Quantum and the government of DRC.
Biased judges and unfair rules
The long and bumpy road towards debt relief for DRC and Liberia demonstrates the need for an independent procedure for debt resolution and clear and binding rules for responsible lending and borrowing. Suggestions for such rules are outlined in the Eurodad Responsible Financing Charter and in the South North platform for sovereign, democratic and responsible financing. A first step towards greater justice in the sovereign debt domain is for indebted countries to undertake independent debt audits and repudiate illegitimate debts. Zimbabwe is likely to be one of the next countries to enter the HIPC initiative and hence undertake the policy and economic reforms required to receive debt relief. The Zimbabwe Coalition on Debt and Development (ZIMCODD) has called for a comprehensive audit of Zimbabwe’s debt instead of entering into the HIPC initiative. In the absence of responsible lending and borrowing standards and of an independent and fair procedure for settling debt disputes, poor people in poor countries will continue to pay too high a price for the reckless behaviour of irresponsible lenders and creditors.
DEBT JUBILEE
http://www.jubileeusa.org/
http://www.uiowa.edu/ifdebook/ebook2/biblio/biblio4.shtml
http://www.ft.com/cms/s/0/85432b32-cd32-11dd-9905-000077b07658.html
http://www.vanityfair.com/online/daily/2009/01/niall-ferguson-america-needs-to-cancel-its-debt.html
Niall Ferguson: America Needs to Cancel Its Debt
by Michael Hogan / January 20, 2009
As dedicated V.F. readers already know, Niall Ferguson “gets” the economic collapse. Now, the historian and bestselling author is sharing his insights with a new book, The Ascent of Money, and an accompanying TV special (which means regular people might actually absorb some of what he has to say). And what he has to say is rather terrifying, with profound implications for an Obama presidency and, beyond that, the future of the United States as a superpower. I tried putting my most basic questions about the economy to Ferguson, and here’s how it went:
Michael Hogan: First of all, this whole financial collapse is great timing for your book. Are you psyched?
Niall Ferguson: Well, I can say with a degree of self-satisfaction that it wasn’t luck. Two and a half years ago I decided to write this book, because I was sure that this financial crisis was going to happen, and the reason I was sure was because people kept coming up to me—whether it was investment bankers or hedge fund managers—telling me that volatility was dead, that there would never be another recession. I just thought, “These people have completely disconnected from reality, and financial history is going to come back and bite them in the ass.”
MH: In the book, you identify the five stages of a bubble. What stage are we at now?
NF: We’re pretty much at the last stage, which is the Panic stage. If you remember roughly how it goes, you begin with some kind of Displacement or shift that changes the economic environment. I would say in this case, the displacement was really caused by the wall of Asian savings coming into the U.S. and keeping interest rates lower than they would normally have been at that point in the cycle. Then you get the Euphoria, which is when people say, “God, now prices can only go up, we should buy more. We should borrow more, because this is a one-way bet.” And then more and more people enter the market, first-time buyers, and that’s the classic run-up phase of the bubble. Then there’s this sort of irrational-exuberance Mania—which came at the end of ’06, when we still had property prices rising at an annualized rate of 20 percent. But then you get Distress. That’s when the insiders, the smart people, start to look at one another and say, “This is nuts, we should get out.” That’s when the John Paulsons start to short the market. And then you get the shift into downward movement of prices, which ultimately culminates in Panic, when everybody heads for the exit together. In this bubble, it happened in a strange kind of slow motion, because the game was really up in August 2007, but it wasn’t really a full-fledged panic—at least across the board—until Lehman Brothers, September 15, 2008, more than a year later. Wile E. Coyote ran off the cliff in August of ’07, but he didn’t really look down until over a year later.
MH: What do you think happens next to the stock market, to the real estate market, and to the banks?
NF: Well, they have further to fall, without doubt, because we’re going to get almost a third phase of the crisis. The third phase of the crisis is when rising unemployment starts to impact the real estate market and consumer spending generally. So, we go another leg down. Unemployment is rising at a very rapid rate. It could go as high as 10 percent, and it’s just going to keep going up in the next two quarters, maybe throughout the year, because this is bad. This is worse than the early 80s. This is as bad as it’s been since the 30s. And in those conditions, there’s going to be further negative movement of real estate prices, and further negative movement of stock prices. People are going to get horrible earnings reports, and when the earnings reports turn out to be worse than anybody expected, the prices of most corporations are going to head south. So, it’s certainly not over. Best case, the rate of decline begins to slow, so we’re not falling vertiginously. We start to fall more gradually.
MH: Is a $700 billion stimulus, like the one Obama is talking about, better than nothing?
NF: Well, it is better than nothing.
MH: Why?
NF: Well, I think we have to realize that nothing would be the Great Depression. So it will be a “success” if output only contracts by five or seven percent. It will be a “success” if unemployment only reaches 11 percent, because in the Great Depression output contracted 30 percent, and unemployment went to 25 percent. These measures that we’re taking at the moment are preventative measures. They’re really designed to prevent a complete implosion of the economy. That’s why I call this, the “Great Repression,” with an “R,” because we are repressing this problem. But, that’s not the same as a cure. And what we’re going to see will look very disappointing, because we’ll be comparing it to the recovery of the sort that we used to see. In a traditional post-war recession, there would be a shock; the Fed would cut rates; there would be some kind of fiscal stimulus; and the economy would quite quickly recover. The reason that won’t work this time, and this is the key point, is that the whole U.S. economy became excessively leveraged in the last ten years. The debt burden, as a proportion of G.D.P., is in the region of 355 percent. So, debt is three and a half times the output of the economy. That’s some kind of historic maximum, and those debts aren’t going away.
MH: So we’ve all been bingeing on money that we didn’t have.
NF: That we borrowed. And we borrowed it from abroad, ultimately. This has been financed by borrowing from petrol exporters, and borrowing from Asian central banks, and sovereign wealth funds. But yeah, whether it was the people who refinanced their mortgages and spent the money that they pocketed, or banks that juiced their returns by piling on the leverage, the whole system became excessively indebted. And notice: what is the policy response? You guessed it, more debt. And, now it’s federal debt. So you end up in a situation where you’re curing a debt problem with more debt. Is that going to bring about a sustained recovery? I find that hard to believe.
MH: So I guess the unanswerable question is, what could you do to solve this problem?
NF: Well, I’ll tell you what you have to do—you actually have to cancel the debt. There are historical precedents for this. Excessive debt burdens in the past tended to be public sector debts. What we’ve got now is an exceptional level of private debt. There’s never been an economy in history that’s had so much private debt. Britain and America today lead the world in the indebtedness of the household sector and the banking sector and the corporate sector. But debt is debt; it doesn’t even matter if it’s household debt or government debt. Once it gets to a certain level, there is a problem. In the past, when excessive debt burdens were accumulated by governments, they tended to do one of two things: either they defaulted—this is the Argentine solution—where you say, “Ah, I’m sorry, I’m afraid we’re not going to be able to meet the interest payments this month, and never again will we make the interest payments.” The other scenario is inflation, where the real debt burden is eroded because the money that it’s denominated in loses value. I don’t think we’re really going to be out of the woods here until something of that sort happens to the huge debt burdens of the U.S. economy. Either these debts will have to be fundamentally written off in some way, or inflation will have to reduce the real burden.
MH: Don’t either of those scenarios spell the end of America as the world’s unrivalled superpower?
NF: Well, it certainly will be extremely painful. And that is why we have to look very closely at the attitude of the foreign creditors, because the U.S. owes the rest of the world a lot of money. Half the federal debt is held by foreigners. And if the U.S. either defaults on debt or allows the dollar to depreciate, the rest of the world is going to say, “Wait a second, you just screwed us.” And that’s, I think, the moment at which the United States experiences the British experience—when, in the dark days of the 60s and 70s, Britain fundamentally lost its credibility and ceased to be a financial great power. The I.M.F. had to come in, and the pound plunged to unheard-of depths.
MH: And George Soros became a billionaire, right?
NF: George Soros and others made some serious money off the back of it, certainly. I mean, somebody can make an awful lot of money off a massive dollar sell-off this year.
MH: How badly could the Chinese screw us if they wanted to?
NF: Well, they would have a difficulty in that they would kind of be screwing themselves. This is their dilemma. There’s a sort of “death embrace” quality to this; I think someone has talked about mutually assured financial destruction. The Chinese have got, we know, reserves in the region of $1.9 trillion, and 70 percent [of it is] dollar denominated, probably. That’s a huge pile of Treasury bonds, not to mention Fannie and Freddie debt that they’ve accumulated over the last decade, when they’ve been intervening to keep their currency weak, and earning these vast amounts of foreign currency by running these trade surpluses. Now, politically, it might be quite tempting for the Chinese to phone up and say, “We really disagree with you about, let’s say, Taiwan and Japan and North Korea. You’d better listen to us, because otherwise, People’s Bank of China starts selling ten-year Treasuries, and then you guys are dead.”
MH: But then their investments become worthless.
NF: Then you lose about five percent of China’s GDP, and that’s a hard sell—even for an authoritarian regime. So, they have a dilemma, and they are discovering the ancient truth that, when the debt is big enough, it’s the debtor who has the power, not the creditor. But, then again, these things aren’t always the result of calculated policy decisions. There’s a sense in which a catalyst elsewhere could force the hand of People’s Bank of China. It doesn’t need to be the Chinese who start the run on the dollar. It could be Middle Eastern investors.
MH: In which case the Chinese might just follow and cut their losses.
NF: Well, they might have no alternative. They might be facing the decision that, “If we hold on, you know, we’re left really holding the hot potato.” So, that is a big worry of theirs. I know it’s a big worry of theirs. They’re thinking, “Can we somehow sneak out of some of these decisions without anybody noticing?” That’s why they’re so secretive. One of the great problems for anybody trying to make a decision about currency is, where else do you go? Short-term, it seems to me that everybody is kind of stuck trying to avoid this dollar crisis because it would be so expensive for those people who are invested in the U.S. But you shouldn’t assume—you can’t assume—that this is a stable state of affairs. It’s anything but that. It’s very, very precarious.
MH: Your book is about moments in history where there were innovations—the creation of money, the creation of credit, the creation of bonds, the stock market, and so on. And the people who were at the wheel during the run-up to the bubble seemed convinced that they had overseen an innovation on this level. Now we’re seeing that maybe they didn’t. Fifty or 500 years from now, when someone writes a book like this one, do you think they’ll look back and see something valuable that came out of this?
NF: I’m sure they will. They’ll look back and they’ll say, “What an extraordinary proliferation of new financial instruments and business models there was between 1980 and 2006. And then the crisis came along, and it was like one of those events in natural history: asteroid hits the Earth, environment becomes a lot colder, only the strong survive. Some species will be extinct: investment banks are already extinct, and hedge funds will go extinct in six months. But they won’t all disappear, and the strong and well managed—and lucky!—will survive.” The derivatives market will contract, but it won’t disappear, because those are useful things. They are simply insurance policies. Too many of them were sold at bad prices. It was clear that the models which were being used to price, say, credit default swaps were fundamentally unrealistic about the probability of defaults. That doesn’t mean that the underlying idea of being able to buy protection against default is a bad one. And that’s characteristic of financial history. If you go back to, say, the banking innovations of the 17th and 18th century, when new banks proliferated all over the English-speaking world, from Scotland to Massachusetts and beyond, banks were invented, and then along would come a financial crisis, and large numbers of them would go bust. But yeah, the ones that survived generally ended up being better banks, and I think that’s the cheerful news. This is an evolutionary system; there is an element of the Darwinian, of the survival of the fittest, and although crises seem to be an integral part of the system, no crisis has been completely fatal to it.
‘MINSKY HALF-CENTURY’
http://www.debtdeflation.com/blogs/wp-content/uploads/papers/KeenAreWeItYetPaperFinal.pdf
http://www.smh.com.au/business/there-will-be-no-recovery-until-debt-tumour-is-excised-20090914-fnug.html
http://www.debtdeflation.com/blogs/research/
http://www.newyorker.com/talk/comment/2008/02/04/080204taco_talk_cassidy
http://www.boston.com/bostonglobe/ideas/articles/2009/09/13/why_capitalism_fails/
Why capitalism fails
by Stephen Mihm / September 13, 2009
Since the global financial system started unraveling in dramatic fashion two years ago, distinguished economists have suffered a crisis of their own. Ivy League professors who had trumpeted the dawn of a new era of stability have scrambled to explain how, exactly, the worst financial crisis since the Great Depression had ambushed their entire profession. Amid the hand-wringing and the self-flagellation, a few more cerebral commentators started to speak about the arrival of a “Minsky moment,” and a growing number of insiders began to warn of a coming “Minsky meltdown.”
“Minsky” was shorthand for Hyman Minsky, a hitherto obscure macroeconomist who died over a decade ago. Many economists had never heard of him when the crisis struck, and he remains a shadowy figure in the profession. But lately he has begun emerging as perhaps the most prescient big-picture thinker about what, exactly, we are going through. A contrarian amid the conformity of postwar America, an expert in the then-unfashionable subfields of finance and crisis, Minsky was one economist who saw what was coming. He predicted, decades ago, almost exactly the kind of meltdown that recently hammered the global economy.
In recent months Minsky’s star has only risen. Nobel Prize-winning economists talk about incorporating his insights, and copies of his books are back in print and selling well. He’s gone from being a nearly forgotten figure to a key player in the debate over how to fix the financial system. But if Minsky was as right as he seems to have been, the news is not exactly encouraging. He believed in capitalism, but also believed it had almost a genetic weakness. Modern finance, he argued, was far from the stabilizing force that mainstream economics portrayed: rather, it was a system that created the illusion of stability while simultaneously creating the conditions for an inevitable and dramatic collapse. In other words, the one person who foresaw the crisis also believed that our whole financial system contains the seeds of its own destruction. “Instability,” he wrote, “is an inherent and inescapable flaw of capitalism.”
Minsky’s vision might have been dark, but he was not a fatalist; he believed it was possible to craft policies that could blunt the collateral damage caused by financial crises. But with a growing number of economists eager to declare the recession over, and the crisis itself apparently behind us, these policies may prove as discomforting as the theories that prompted them in the first place. Indeed, as economists re-embrace Minsky’s prophetic insights, it is far from clear that they’re ready to reckon with the full implications of what he saw. In an ideal world, a profession dedicated to the study of capitalism would be as freewheeling and innovative as its ostensible subject. But economics has often been subject to powerful orthodoxies, and never more so than when Minsky arrived on the scene.
That orthodoxy, born in the years after World War II, was known as the neoclassical synthesis. The older belief in a self-regulating, self-stabilizing free market had selectively absorbed a few insights from John Maynard Keynes, the great economist of the 1930s who wrote extensively of the ways that capitalism might fail to maintain full employment. Most economists still believed that free-market capitalism was a fundamentally stable basis for an economy, though thanks to Keynes, some now acknowledged that government might under certain circumstances play a role in keeping the economy – and employment – on an even keel. Economists like Paul Samuelson became the public face of the new establishment; he and others at a handful of top universities became deeply influential in Washington. In theory, Minsky could have been an academic star in this new establishment: Like Samuelson, he earned his doctorate in economics at Harvard University, where he studied with legendary Austrian economist Joseph Schumpeter, as well as future Nobel laureate Wassily Leontief.
But Minsky was cut from different cloth than many of the other big names. The descendant of immigrants from Minsk, in modern-day Belarus, Minsky was a red-diaper baby, the son of Menshevik socialists. While most economists spent the 1950s and 1960s toiling over mathematical models, Minsky pursued research on poverty, hardly the hottest subfield of economics. With long, wild, white hair, Minsky was closer to the counterculture than to mainstream economics. He was, recalls the economist L. Randall Wray, a former student, a “character.” So while his colleagues from graduate school went on to win Nobel prizes and rise to the top of academia, Minsky languished. He drifted from Brown to Berkeley and eventually to Washington University. Indeed, many economists weren’t even aware of his work. One assessment of Minsky published in 1997 simply noted that his “work has not had a major influence in the macroeconomic discussions of the last thirty years.”
Yet he was busy. In addition to poverty, Minsky began to delve into the field of finance, which despite its seeming importance had no place in the theories formulated by Samuelson and others. He also began to ask a simple, if disturbing question: “Can ‘it’ happen again?” – where “it” was, like Harry Potter’s nemesis Voldemort, the thing that could not be named: the Great Depression. In his writings, Minsky looked to his intellectual hero, Keynes, arguably the greatest economist of the 20th century. But where most economists drew a single, simplistic lesson from Keynes – that government could step in and micromanage the economy, smooth out the business cycle, and keep things on an even keel – Minsky had no interest in what he and a handful of other dissident economists came to call “bastard Keynesianism.”
Instead, Minsky drew his own, far darker, lessons from Keynes’s landmark writings, which dealt not only with the problem of unemployment, but with money and banking. Although Keynes had never stated this explicitly, Minsky argued that Keynes’s collective work amounted to a powerful argument that capitalism was by its very nature unstable and prone to collapse. Far from trending toward some magical state of equilibrium, capitalism would inevitably do the opposite. It would lurch over a cliff. This insight bore the stamp of his advisor Joseph Schumpeter, the noted Austrian economist now famous for documenting capitalism’s ceaseless process of “creative destruction.” But Minsky spent more time thinking about destruction than creation. In doing so, he formulated an intriguing theory: not only was capitalism prone to collapse, he argued, it was precisely its periods of economic stability that would set the stage for monumental crises.
Minsky called his idea the “Financial Instability Hypothesis.” In the wake of a depression, he noted, financial institutions are extraordinarily conservative, as are businesses. With the borrowers and the lenders who fuel the economy all steering clear of high-risk deals, things go smoothly: loans are almost always paid on time, businesses generally succeed, and everyone does well. That success, however, inevitably encourages borrowers and lenders to take on more risk in the reasonable hope of making more money. As Minsky observed, “Success breeds a disregard of the possibility of failure.” As people forget that failure is a possibility, a “euphoric economy” eventually develops, fueled by the rise of far riskier borrowers – what he called speculative borrowers, those whose income would cover interest payments but not the principal; and those he called “Ponzi borrowers,” those whose income could cover neither, and could only pay their bills by borrowing still further. As these latter categories grew, the overall economy would shift from a conservative but profitable environment to a much more freewheeling system dominated by players whose survival depended not on sound business plans, but on borrowed money and freely available credit.
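The taxonomy is mechanical enough to encode directly. The sketch below is illustrative: all numbers are invented, and the "hedge" label for the safest class is Minsky's own term, supplied for completeness since the article names only the two riskier types:

```python
# A toy encoding of the borrower taxonomy described above. All numbers are
# invented; "hedge" is Minsky's term for the safest class, which the
# article describes but does not name.

def classify(income: float, interest_due: float, principal_due: float) -> str:
    if income >= interest_due + principal_due:
        return "hedge"        # cash flow covers interest and principal
    if income >= interest_due:
        return "speculative"  # covers interest only; principal must be rolled over
    return "Ponzi"            # covers neither; must borrow just to pay the bills

borrowers = [
    ("prudent firm", 120.0, 40.0, 60.0),
    ("rollover firm", 50.0, 40.0, 60.0),
    ("flipper", 20.0, 40.0, 60.0),
]
for name, income, interest, principal in borrowers:
    print(f"{name:>13}: {classify(income, interest, principal)}")
```

As the share of "speculative" and "Ponzi" rows grows, the system's dependence on continuously available credit grows with it, which is the fragility the hypothesis describes.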
Once that kind of economy had developed, any panic could wreck the market. The failure of a single firm, for example, or the revelation of a staggering fraud could trigger fear and a sudden, economy-wide attempt to shed debt. This watershed moment – what was later dubbed the “Minsky moment” – would create an environment deeply inhospitable to all borrowers. The speculators and Ponzi borrowers would collapse first, as they lost access to the credit they needed to survive. Even the more stable players might find themselves unable to pay their debt without selling off assets; their forced sales would send asset prices spiraling downward, and inevitably, the entire rickety financial edifice would start to collapse. Businesses would falter, and the crisis would spill over to the “real” economy that depended on the now-collapsing financial system.
From the 1960s onward, Minsky elaborated on this hypothesis. At the time he believed that this shift was already underway: postwar stability, financial innovation, and the receding memory of the Great Depression were gradually setting the stage for a crisis of epic proportions. Most of what he had to say fell on deaf ears. The 1960s were an era of solid growth, and although the economic stagnation of the 1970s was a blow to mainstream neo-Keynesian economics, it did not send policymakers scurrying to Minsky. Instead, a new free market fundamentalism took root: government was the problem, not the solution. Moreover, the new dogma coincided with a remarkable era of stability. The period from the late 1980s onward has been dubbed the “Great Moderation,” a time of shallow recessions and great resilience among most major industrial economies. Things had never been more stable. The likelihood that “it” could happen again now seemed laughable.
Yet throughout this period, the financial system – not the economy, but finance as an industry – was growing by leaps and bounds. Minsky spent the last years of his life, in the early 1990s, warning of the dangers of securitization and other forms of financial innovation, but few economists listened. Nor did they pay attention to consumers’ and companies’ growing dependence on debt, and the growing use of leverage within the financial system. By the end of the 20th century, the financial system that Minsky had warned about had materialized, complete with speculative borrowers, Ponzi borrowers, and precious few of the conservative borrowers who were the bedrock of a truly stable economy. Over decades, we really had forgotten the meaning of risk. When storied financial firms started to fall, sending shockwaves through the “real” economy, his predictions started to look a lot like a road map. “This wasn’t a Minsky moment,” explains Randall Wray. “It was a Minsky half-century.”
Minsky is now all the rage. A year ago, an influential Financial Times columnist confided to readers that rereading Minsky’s 1986 “masterpiece” – “Stabilizing an Unstable Economy” – “helped clear my mind on this crisis.” Others joined the chorus. Earlier this year, two economic heavyweights – Paul Krugman and Brad DeLong – both tipped their hats to him in public forums. Indeed, the Nobel Prize-winning Krugman titled one of the Robbins lectures at the London School of Economics “The Night They Re-read Minsky.” Today most economists, it’s safe to say, are probably reading Minsky for the first time, trying to fit his unconventional insights into the theoretical scaffolding of their profession. If Minsky were alive today, he would no doubt applaud this belated acknowledgment, even if it has come at a terrible cost. As he once wryly observed, “There is nothing wrong with macroeconomics that another depression [won’t] cure.”
But does Minsky’s work offer us any practical help? If capitalism is inherently self-destructive and unstable – never mind that it produces inequality and unemployment, as Keynes had observed – now what? After spending his life warning of the perils of the complacency that comes with stability – and seeing those warnings fall on deaf ears – Minsky was understandably pessimistic about the possibility of short-circuiting the tragic cycle of boom and bust. But he did believe that much could be done to ameliorate the damage. To prevent the Minsky moment from becoming a national calamity, part of his solution (one he shared with other economists) was to have the Federal Reserve – what he liked to call the “Big Bank” – step into the breach and act as a lender of last resort to firms under siege. By throwing lines of liquidity to foundering firms, the Federal Reserve could break the cycle and stabilize the financial system. It failed to do so during the Great Depression, when it stood by and let a banking crisis spiral out of control. This time, under the leadership of Ben Bernanke – like Minsky, a scholar of the Depression – it took a very different approach, becoming a lender of last resort to everything from hedge funds to investment banks to money market funds.
Minsky’s other solution, however, was considerably more radical and less palatable politically. The preferred mainstream tactic for pulling the economy out of a crisis was – and is – based on the Keynesian notion of “priming the pump” by spending money on projects that employ lots of high-skilled, unionized labor – by building a new high-speed train line, for example. Minsky, however, argued for a “bubble-up” approach, sending money to the poor and unskilled first. The government – or what he liked to call “Big Government” – should become the “employer of last resort,” he said, offering a job to anyone who wanted one at a set minimum wage. That wage would be paid to workers who would supply child care, clean streets, and provide services that would give taxpayers a visible return on their dollars. In being available to everyone, the program would be even more ambitious than the New Deal, sharply reducing the welfare rolls by guaranteeing a job for anyone who was able to work. Such a program would not only help the poor and unskilled, he believed, but would put a floor beneath everyone else’s wages too, preventing the salaries of more skilled workers from falling too precipitously, and sending benefits up the socioeconomic ladder.
While economists may be acknowledging some of Minsky’s points on financial instability, it’s safe to say that even liberal policymakers are still a long way from thinking about such an expanded role for the American government. If nothing else, an expensive full-employment program would veer far too close to socialism for the comfort of politicians. For his part, Wray thinks that the critics are apt to misunderstand Minsky. “He saw these ideas as perfectly consistent with capitalism,” says Wray. “They would make capitalism better.” But not perfect. Indeed, if there’s anything to be drawn from Minsky’s collected work, it’s that perfection, like stability and equilibrium, is a mirage. Minsky did not share his profession’s quaint belief that everything could be reduced to a tidy model, or a pat theory. His was a kind of existential economics: capitalism, like life itself, is difficult, even tragic. “There is no simple answer to the problems of our capitalism,” wrote Minsky. “There is no solution that can be transformed into a catchy phrase and carried on banners.” It’s a sentiment that may limit the extent to which Minsky becomes part of any new orthodoxy. But that’s probably how he would have preferred it, believes liberal economist James Galbraith. “I think he would resist being domesticated,” says Galbraith. “He spent his career in professional isolation.”
HISTORY OF DEBT
http://www.eurozine.com/articles/2009-08-20-graeber-en.html
Debt: The first five thousand years
by David Graeber / 08.20.2009
What follows is a fragment of a much larger project of research on debt and debt money in human history. The first and overwhelming conclusion of this project is that in studying economic history, we tend to systematically ignore the role of violence, the absolutely central role of war and slavery in creating and shaping the basic institutions of what we now call “the economy”. What’s more, origins matter. The violence may be invisible, but it remains inscribed in the very logic of our economic common sense, in the apparently self-evident nature of institutions that simply would never and could never exist outside of the monopoly of violence – but also, the systematic threat of violence – maintained by the contemporary state.
Let me start with the institution of slavery, whose role, I think, is key. In most times and places, slavery is seen as a consequence of war. Sometimes most slaves actually are war captives, sometimes they are not, but almost invariably, war is seen as the foundation and justification of the institution. If you surrender in war, what you surrender is your life; your conqueror has the right to kill you, and often will. If he chooses not to, you literally owe your life to him; a debt conceived as absolute, infinite, irredeemable. He can in principle extract anything he wants, and all debts – obligations – you may owe to others (your friends, family, former political allegiances), or that others owe you, are seen as being absolutely negated. Your debt to your owner is all that now exists.
This sort of logic has at least two very interesting consequences, though they might be said to pull in rather contrary directions. First of all, as we all know, it is another typical – perhaps defining – feature of slavery that slaves can be bought or sold. In this case, absolute debt becomes (in another context, that of the market) no longer absolute. In fact, it can be precisely quantified. There is good reason to believe that it was just this operation that made it possible to create something like our contemporary form of money to begin with, since what anthropologists used to refer to as “primitive money”, the kind that one finds in stateless societies (Solomon Island feather money, Iroquois wampum), was mostly used to arrange marriages, resolve blood feuds, and fiddle with other sorts of relations between people, rather than to buy and sell commodities. Conversely, if slavery is debt, then debt can lead to slavery. A Babylonian peasant might have paid a handy sum in silver to his wife’s parents to officialise the marriage, but he in no sense owned her. He certainly couldn’t buy or sell the mother of his children. But all that would change if he took out a loan. Were he to default, his creditors could first remove his sheep and furniture, then his house, fields and orchards, and finally take his wife, children, and even himself as debt peons until the matter was settled (which, as his resources vanished, of course became increasingly difficult to do). Debt was the hinge that made it possible to imagine money in anything like the modern sense, and therefore, also, to produce what we like to call the market: an arena where anything can be bought and sold, because all objects are (like slaves) disembedded from their former social relations and exist only in relation to money.
But at the same time the logic of debt as conquest can, as I mentioned, pull another way. Kings, throughout history, tend to be profoundly ambivalent towards allowing the logic of debt to get completely out of hand. This is not because they are hostile to markets. On the contrary, they normally encourage them, for the simple reason that governments find it inconvenient to levy everything they need (silks, chariot wheels, flamingo tongues, lapis lazuli) directly from their subject population; it’s much easier to encourage markets and then buy them. Early markets often followed armies or royal entourages, or formed near palaces or at the fringes of military posts. This actually helps explain the rather puzzling behaviour on the part of royal courts: after all, since kings usually controlled the gold and silver mines, what exactly was the point of stamping bits of the stuff with your face on it, dumping it on the civilian population, and then demanding they give it back to you again as taxes? It only makes sense if levying taxes was really a way to force everyone to acquire coins, so as to facilitate the rise of markets, since markets were convenient to have around. However, for our present purposes, the critical question is: how were these taxes justified? Why did subjects owe them, what debt were they discharging when they were paid? Here we return again to right of conquest. (Actually, in the ancient world, free citizens – whether in Mesopotamia, Greece, or Rome – often did not have to pay direct taxes for this very reason, but obviously I’m simplifying here.) If kings claimed to hold the power of life and death over their subjects by right of conquest, then their subjects’ debts were, also, ultimately infinite; and also, at least in that context, their relations to one another, what they owed to one another, was unimportant. All that really existed was their relation to the king. This in turn explains why kings and emperors invariably tried to regulate the powers that masters had over slaves, and creditors over debtors. At the very least they would always insist, if they had the power, that those prisoners who had already had their lives spared could no longer be killed by their masters. In fact, only rulers could have arbitrary power over life and death. One’s ultimate debt was to the state; it was the only one that was truly unlimited, that could make absolute, cosmic, claims.
The reason I stress this is because this logic is still with us. When we speak of a “society” (French society, Jamaican society) we are really speaking of people organised by a single nation state. That is the tacit model, anyway. “Societies” are really states, the logic of states is that of conquest, the logic of conquest is ultimately identical to that of slavery. True, in the hands of state apologists, this becomes transformed into a notion of a more benevolent “social debt”. Here there is a little story told, a kind of myth. We are all born with an infinite debt to the society that raised, nurtured, fed and clothed us, to those long dead who invented our language and traditions, to all those who made it possible for us to exist. In ancient times we thought we owed this to the gods (it was repaid in sacrifice, or, sacrifice was really just the payment of interest – ultimately, it was repaid by death). Later the debt was adopted by the state, itself a divine institution, with taxes substituted for sacrifice, and military service for one’s debt of life. Money is simply the concrete form of this social debt, the way that it is managed. Keynesians like this sort of logic. So do various strains of socialists and social democrats, even crypto-fascists like Auguste Comte (the first, as far as I am aware, to actually coin the phrase “social debt”). But the logic also runs through much of our common sense: consider for instance, the phrase, “to pay one’s debt to society”, or, “I felt I owed something to my country”, or, “I wanted to give something back.” Always, in such cases, mutual rights and obligations, mutual commitments – the kind of relations that genuinely free people could make with one another – tend to be subsumed into a conception of “society” where we are all equal only as absolute debtors before the (now invisible) figure of the king, who stands in for your mother, and by extension, humanity.
What I am suggesting, then, is that while the claims of the impersonal market and the claims of “society” are often juxtaposed – and certainly have had a tendency to jockey back and forth in all sorts of practical ways – they are both ultimately founded on a very similar logic of violence. Neither is this a mere matter of historical origins that can be brushed away as inconsequential: neither states nor markets can exist without the constant threat of force. One might ask, then, what is the alternative?
Towards a history of virtual money
Here I can return to my original point: that money did not originally appear in this cold, metal, impersonal form. It originally appears in the form of a measure, an abstraction, but also as a relation (of debt and obligation) between human beings. It is important to note that historically it is commodity money that has always been most directly linked to violence. As one historian put it, “bullion is the accessory of war, and not of peaceful trade.”[1] The reason is simple. Commodity money, particularly in the form of gold and silver, is distinguished from credit money most of all by one spectacular feature: it can be stolen. Since an ingot of gold or silver is an object without a pedigree, throughout much of history bullion has served the same role as the contemporary drug dealer’s suitcase full of dollar bills, as an object without a history that will be accepted in exchange for other valuables just about anywhere, with no questions asked. As a result, one can see the last 5,000 years of human history as the history of a kind of alternation. Credit systems seem to arise, and to become dominant, in periods of relative social peace, across networks of trust, whether created by states or, in most periods, transnational institutions, whilst precious metals replace them in periods characterised by widespread plunder. Predatory lending systems certainly exist at every period, but they seem to have had the most damaging effects in periods when money was most easily convertible into cash.
So as a starting point to any attempt to discern the great rhythms that define the current historical moment, let me propose the following breakdown of Eurasian history according to the alternation between periods of virtual and metal money:
I. Age of the First Agrarian Empires (3500-800 BCE). Dominant money form: Virtual credit money
Our best information on the origins of money goes back to ancient Mesopotamia, but there seems no particular reason to believe matters were radically different in Pharaonic Egypt, Bronze Age China, or the Indus Valley. The Mesopotamian economy was dominated by large public institutions (Temples and Palaces) whose bureaucratic administrators effectively created money of account by establishing a fixed equivalent between silver and the staple crop, barley. Debts were calculated in silver, but silver was rarely used in transactions. Instead, payments were made in barley or in anything else that happened to be handy and acceptable. Major debts were recorded on cuneiform tablets kept as sureties by both parties to the transaction.
Certainly, markets did exist. Prices of certain commodities that were not produced within Temple or Palace holdings, and thus not subject to administered price schedules, would tend to fluctuate according to the vagaries of supply and demand. But most actual acts of everyday buying and selling, particularly those that were not carried out between absolute strangers, appear to have been made on credit. “Ale women”, or local innkeepers, served beer, for example, and often rented rooms; customers ran up a tab; normally, the full sum was dispatched at harvest time. Market vendors presumably acted as they do in small-scale markets in Africa, or Central Asia, today, building up lists of trustworthy clients to whom they could extend credit. The habit of lending money at interest also originates in Sumer – it remained unknown, for example, in Egypt. Interest rates, fixed at 20 percent, remained stable for 2,000 years. (This was not a sign of government control of the market: at this stage, institutions like this were what made markets possible.) This, however, led to some serious social problems. In years with bad harvests especially, peasants would start becoming hopelessly indebted to the rich, and would have to surrender their farms and, ultimately, family members, in debt bondage. Gradually, this condition seems to have grown into a social crisis – not so much leading to popular uprisings, but to common people abandoning the cities and settled territory entirely and becoming semi-nomadic “bandits” and raiders. It soon became traditional for each new ruler to wipe the slate clean, cancel all debts, and declare a general amnesty or “freedom”, so that all bonded labourers could return to their families. (It is significant here that the first word for “freedom” known in any human language, the Sumerian amargi, literally means “return to mother”.) Biblical prophets instituted a similar custom, the Jubilee, whereby after seven years all debts were similarly cancelled. This is the direct ancestor of the New Testament notion of “redemption”. As economist Michael Hudson has pointed out, it seems one of the misfortunes of world history that the institution of lending money at interest disseminated out of Mesopotamia without, for the most part, being accompanied by its original checks and balances.
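The arithmetic behind that slide into bondage is stark. As a minimal sketch – assuming, purely for illustration, that unpaid interest is simply added to the principal each year, which is only one possible convention – a debt at the 20 percent rate mentioned above doubles in about four years:

```python
def years_to_double(rate=0.20):
    """Count the years for an unpaid debt to double at a given annual rate."""
    debt, years = 1.0, 0
    while debt < 2.0:
        debt *= 1 + rate  # unpaid interest is rolled into the principal
        years += 1
    return years

# 1.2 ** 4 is roughly 2.07, so at 20 percent:
print(years_to_double())  # -> 4
```

At that pace, two or three failed harvests could leave a peasant owing more than his land and family could ever redeem – hence the pressure for the periodic clean slates described above.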
II. Axial Age (800 BCE – 600 CE). Dominant money form: Coinage and metal bullion
This was the age that saw the emergence of coinage, as well as the birth, in China, India and the Middle East, of all major world religions.[2] From the Warring States period in China, to fragmentation in India, and to the carnage and mass enslavement that accompanied the expansion (and later, dissolution) of the Roman Empire, it was a period of spectacular creativity throughout most of the world, but of almost equally spectacular violence. Coinage, which allowed for the actual use of gold and silver as a medium of exchange, also made possible the creation of markets in the now more familiar, impersonal sense of the term. Precious metals were also far more appropriate for an age of generalised warfare, for the obvious reason that they could be stolen. Coinage, certainly, was not invented to facilitate trade (the Phoenicians, consummate traders of the ancient world, were among the last to adopt it). It appears to have been invented first of all to pay soldiers, probably by rulers of Lydia in Asia Minor to pay their Greek mercenaries. Carthage, another great trading nation, only started minting coins very late, and then explicitly to pay its foreign soldiers.
Throughout antiquity one can continue to speak of what Geoffrey Ingham has dubbed the “military-coinage complex”. He might have done better to call it a “military-coinage-slavery complex”, since the diffusion of new military technologies (Greek hoplites, Roman legions) was always closely tied to the capture and marketing of slaves. The other major source of slaves was debt: now that states no longer periodically wiped the slates clean, those not lucky enough to be citizens of the major military city-states – who were generally protected from predatory lenders – were fair game. The credit systems of the Near East did not crumble under commercial competition; they were destroyed by Alexander’s armies – armies that required half a ton of silver bullion per day in wages. The mines where the bullion was produced were generally worked by slaves. Military campaigns in turn ensured an endless flow of new slaves. Imperial tax systems, as noted, were largely designed to force their subjects to create markets, so that soldiers (and also, of course, government officials) would be able to use that bullion to buy anything they wanted. The kind of impersonal markets that once tended to spring up between societies, or at the fringes of military operations, now began to permeate society as a whole.
However tawdry their origins, the creation of new media of exchange – coinage appeared almost simultaneously in Greece, India, and China – appears to have had profound intellectual effects. Some have even gone so far as to argue that Greek philosophy was itself made possible by conceptual innovations introduced by coinage. The most remarkable pattern, though, is the emergence, in almost the exact times and places where one also sees the early spread of coinage, of what were to become modern world religions: prophetic Judaism, Christianity, Buddhism, Jainism, Confucianism, Taoism, and eventually, Islam. While the precise links are yet to be fully explored, in certain ways, these religions appear to have arisen in direct reaction to the logic of the market. To put the matter somewhat crudely: if one relegates a certain social space simply to the selfish acquisition of material things, it is almost inevitable that soon someone else will come to set aside another domain in which to preach that, from the perspective of ultimate values, material things are unimportant, and selfishness – or even the self – illusory.
III. The Middle Ages (600 CE – 1500 CE). The return to virtual credit money
If the Axial Age saw the emergence of complementary ideals of commodity markets and universal world religions, the Middle Ages[3] were the period in which those two institutions began to merge. Religions began to take over the market systems. Everything from international trade to the organisation of local fairs increasingly came to be carried out through social networks defined and regulated by religious authorities. This enabled, in turn, the return throughout Eurasia of various forms of virtual credit money.
In Europe, where all this took place under the aegis of Christendom, coinage was only sporadically, and unevenly, available. Prices after 800 AD were calculated largely in terms of an old Carolingian currency that no longer existed (it was actually referred to at the time as “imaginary money”), but ordinary day-to-day buying and selling was carried out mainly through other means. One common expedient, for example, was the use of tally-sticks, notched pieces of wood that were broken in two as records of debt, with half being kept by the creditor, half by the debtor. Such tally-sticks were still in common use in much of England well into the 16th century. Larger transactions were handled through bills of exchange, with the great commercial fairs serving as their clearing houses. The Church, meanwhile, provided a legal framework, enforcing strict controls on the lending of money at interest and prohibitions on debt bondage.
The real nerve centre of the Medieval world economy, though, was the Indian Ocean, which along with the Central Asian caravan routes connected the great civilisations of India, China, and the Middle East. Here, trade was conducted through the framework of Islam, which not only provided a legal structure highly conducive to mercantile activities (while absolutely forbidding the lending of money at interest), but allowed for peaceful relations between merchants over a remarkably large part of the globe, allowing the creation of a variety of sophisticated credit instruments. Actually, Western Europe was, as in so many things, a relative late-comer in this regard: most of the financial innovations that reached Italy and France in the 11th and 12th centuries had been in common use in Egypt or Iraq since the 8th or 9th centuries. The word “cheque”, for example, derives from the Arabic sakk, and appeared in English only around 1220 AD.
The case of China is even more complicated: the Middle Ages there began with the rapid spread of Buddhism, which, while it was in no position to enact laws or regulate commerce, did quickly move against local usurers by its invention of the pawn shop – the first pawn shops being based in Buddhist temples as a way of offering poor farmers an alternative to the local usurer. Before long, though, the state reasserted itself, as the state always tends to do in China. But as it did so, it not only regulated interest rates and attempted to abolish debt peonage, it moved away from bullion entirely by inventing paper money. All this was accompanied by the development, again, of a variety of complex financial instruments.
All this is not to say that this period did not see its share of carnage and plunder (particularly during the great nomadic invasions) or that coinage was not, in many times and places, an important medium of exchange. Still, what really characterises the period appears to be a movement in the other direction. Most of the Medieval period saw money largely delinked from coercive institutions. Money changers, one might say, were invited back into the temples, where they could be monitored. The result was a flowering of institutions premised on a much higher degree of social trust.
IV. Age of European Empires (1500-1971). The return of precious metals
With the advent of the great European empires – Iberian, then North Atlantic – the world saw both a reversion to mass enslavement, plunder, and wars of destruction, and the consequent rapid return of gold and silver bullion as the main form of currency. Historical investigation will probably end up demonstrating that the origins of these transformations were more complicated than we ordinarily assume. Some of this was beginning to happen even before the conquest of the New World. One of the main factors of the movement back to bullion, for example, was the emergence of popular movements during the early Ming dynasty, in the 15th and 16th centuries, that ultimately forced the government to abandon not only paper money but any attempt to impose its own currency. This led to the reversion of the vast Chinese market to an uncoined silver standard. Since taxes were also gradually commuted into silver, it soon became the more or less official Chinese policy to try to bring as much silver into the country as possible, so as to keep taxes low and prevent new outbreaks of social unrest. The sudden enormous demand for silver had effects across the globe. Most of the precious metals looted by the conquistadors and later extracted by the Spanish from the mines of Mexico and Potosi (at almost unimaginable cost in human lives) ended up in China. These global-scale connections that eventually developed across the Atlantic, Pacific, and Indian Oceans have of course been documented in great detail. The crucial point is that the delinking of money from religious institutions, and its relinking with coercive ones (especially the state), was here accompanied by an ideological reversion to “metallism”.[4]
Credit, in this context, was on the whole an affair of states that were themselves run largely by deficit financing, a form of credit which was, in turn, invented to finance increasingly expensive wars. Internationally the British Empire was steadfast in maintaining the gold standard through the 19th and early 20th centuries, and great political battles were fought in the United States over whether the gold or silver standard should prevail.
This was also, obviously, the period of the rise of capitalism, the industrial revolution, representative democracy, and so on. What I am trying to do here is not to deny their importance, but to provide a framework for seeing such familiar events in a less familiar context. It makes it easier, for instance, to detect the ties between war, capitalism, and slavery. The institution of wage labour, for instance, has historically emerged from within that of slavery (the earliest wage contracts we know of, from Greece to the Malay city states, were actually slave rentals), and it has also tended, historically, to be intimately tied to various forms of debt peonage – as indeed it remains today. The fact that we have cast such institutions in a language of freedom does not mean that what we now think of as economic freedom does not ultimately rest on a logic that has for most of human history been considered the very essence of slavery.
V. Current Era (1971 onwards). The empire of debt
The current era might be said to have been initiated on 15 August 1971, when US President Richard Nixon officially suspended the convertibility of the dollar into gold and effectively created the current floating currency regimes. We have returned, at any rate, to an age of virtual money, in which consumer purchases in wealthy countries rarely involve even paper money, and national economies are driven largely by consumer debt. It’s in this context that we can talk about the “financialisation” of capital, whereby speculation in currencies and financial instruments becomes a domain unto itself, detached from any immediate relation with production or even commerce. This is of course the sector that has entered into crisis today.
What can we say for certain about this new era? So far, very, very little. Thirty or forty years is nothing in terms of the scale we have been dealing with. Clearly, this period has only just begun. Still, the foregoing analysis, however crude, does allow us to begin to make some informed suggestions. Historically, as we have seen, ages of virtual, credit money have also involved creating some sort of overarching institutions – Mesopotamian sacred kingship, Mosaic jubilees, Sharia or Canon Law – that place some sort of controls on the potentially catastrophic social consequences of debt. Almost invariably, they involve institutions (usually not strictly coincident with the state, usually larger) to protect debtors. So far the movement this time has been the other way around: starting with the ’80s we have begun to see the creation of the first effective planetary administrative system, operating through the IMF, World Bank, corporations and other financial institutions, largely in order to protect the interests of creditors. However, this apparatus was very quickly thrown into crisis, first by the very rapid development of global social movements (the alter-globalisation movement), which effectively destroyed the moral authority of institutions like the IMF and left many of them very close to bankruptcy, and now by the current banking crisis and global economic collapse. While the new age of virtual money has only just begun and the long-term consequences are as yet entirely unclear, we can already say one or two things. The first is that a movement towards virtual money is not in itself, necessarily, an insidious effect of capitalism. In fact, it might well mean exactly the opposite. For much of human history, systems of virtual money were designed and regulated to ensure that nothing like capitalism could ever emerge to begin with – at least not as it appears in its present form, with most of the world’s population placed in a condition that would in many other periods of history be considered tantamount to slavery. The second point is to underline the absolutely crucial role of violence in defining the very terms by which we imagine both “society” and “markets” – in fact, many of our most elementary ideas of freedom. A world less entirely pervaded by violence would rapidly begin to develop other institutions. Finally, thinking about debt outside the twin intellectual straitjackets of state and market opens up exciting possibilities. For instance, we can ask: in a society in which that foundation of violence had finally been yanked away, what exactly would free men and women owe each other? What sort of promises and commitments should they make to each other? Let us hope that everyone will someday be in a position to start asking such questions. At times like this, you never know.
[1] Geoffrey W. Gardiner, “The Primacy of Trade Debts in the Development of Money”, in Randall Wray (ed.), Credit and State Theories of Money: The Contributions of A. Mitchell Innes, Cheltenham: Elgar, 2004, p.134.
[2] The phrase the “Axial Age” was originally coined by Karl Jaspers to describe the relatively brief period between 800 BCE and 200 BCE in which, he believed, just about all the main philosophical traditions we are familiar with today arose simultaneously in China, India, and the Eastern Mediterranean. Here, I am using it in Lewis Mumford’s more expansive use of the term as the period that saw the birth of all existing world religions, stretching roughly from the time of Zoroaster to that of Mohammed.
[3] I am here relegating most of what is generally referred to as the “Dark Ages” in Europe into the earlier period, characterised by predatory militarism and the consequent importance of bullion: the Viking raids, and the famous extraction of danegeld from England in the 800s, might be seen as one of the last manifestations of an age where predatory militarism went hand in hand with hoards of gold and silver bullion.
[4] The myth of barter and commodity theories of money was of course developed in this period.